Dying, with a lifetime of literature 

When I was diagnosed with a terminal illness, I didn’t expect the books I taught for 30 years to define how I coped


When I was diagnosed with a terminal illness, I was determined not to let the disease define me. With the exception of one fundraiser, I declined offers to give talks, blog, or even write a narrative essay about my struggles with the debilitating symptoms of amyotrophic lateral sclerosis (ALS). While I held to that resolve, I hadn’t expected that the classic literature I’d been teaching for 30 years would define how I coped with my illness.

I was diagnosed with ALS in August 2015 — on the Friday before school was to resume. I slipped into my classroom that weekend and filled a box with mementos, leaving behind my personal copies of literary masterpieces and cabinets filled with curricula — abandoning, or so I thought, a lifetime of literature.

At first, my thoughts of school brought only relief. Relief that I had left my job before greeting 150-plus new students and taking them on a journey of uncertainty and loss; at 17, they had plenty else to worry about.

As the months passed, however, my physical strength waned, and unlike the industrious doer I’d been my whole life, I became dependent upon others. After I lost the use of my hands, it became difficult to find meaningful ways to spend my time. I despaired at having no control over my life. I strove to focus on the moments of the day when I was warmed by a kind word or an image of natural beauty. When I did pause to appreciate these instances, I’d hear the words “This one is warmed . . .”

At first I was at a loss for the source of the line. I was certain it was from a Toni Morrison novel, but when I consulted Google, I was reminded that the phrase came from Morrison’s Nobel Prize acceptance speech. Near the conclusion of her lecture, she tells a brief story about a wagon filled with slaves journeying to a plantation where their lives will end. The driver stops at an inn for a meal, leaving the slaves shivering in the back of the wagon. Two children tend to the slaves, giving them food and sips of warm cider. Morrison sums up this respite from pain and impending hopelessness: “The next stop would be their last, but this one was warmed.” I too was nearly at my last stop — death — but pausing to appreciate the moments that were warmed by small gestures and glimpses of natural beauty dulled the pangs of despair, and I had Morrison to thank for expressing the ineffable emotions that I may have missed had it not been for her words echoing in my mind.

With my mobility limited and my voice diminished, I would often lie in bed and find myself bothered by ridiculous things: a shriveled leaf on a house plant or a crooked lampshade. When someone entered the room to visit, I would seek a way to ask them to correct the irritant. But if they plucked the wrong leaf or didn’t understand me at all, I would usually realize the foolishness of wasting energy on getting my way and somewhere from the recesses of my memory be reminded, “Do not seek to be master of all . . . ”

At first I assumed those words came from the New Testament or possibly the humanist Shakespeare, but when Google insisted that Sophocles penned those lines, I found myself sharing the tragic stage with Oedipus as his brother-in-law Creon admonishes him for failing to learn that fate cannot be circumvented. Oedipus and I both had to learn acceptance. Although acceptance sounds like a passive stance, it would become the hardest work of my life.

Acquiescing to my fate and allowing others to do what they deemed best for my unfamiliar and uncooperative body took patience. But when my confinement to a wheelchair required converting my office into a bedroom, replacing my desk with a ramp and my bookcase with a portable commode, I balked. Like Kafka’s Gregor Samsa, who awakened to find himself transformed into a giant insect, I felt alien in appearance and among unfamiliar surroundings. In an attempt to make me more comfortable, my family, like Gregor’s, had replaced my furniture — my identity — with utility. I slept in a motorized bed with bars and awoke alone, with a green button to summon my husband to untangle my limp legs from the blankets. Despite the love and care I was receiving, nothing except the occasional dream let me pretend that I was myself.

Eventually all my inner turmoil will have to give way to complete surrender. I’m not quite there yet. But I do hear one of Hamlet’s less-famous lines, spoken after most of the chaos in the play subsides: “Let be.” Resonating in those two words is Hamlet’s acceptance: “There is special providence in the fall of a sparrow.” I pray that I may soon die accepting this lesson that’s taken a lifetime to learn.

Lynette Williamson taught high school English and coached debate for 30 years in Sonoma County, California, where she and her husband Don raised two children who had better keep their promise to their mother and produce grandchildren one day. 

Elegy for a Year of Death in America

If Nietzsche was right about “what does not kill me,” we’re stronger now. Facing the darkness is the way forward.


“Peace, peace!” wrote Percy Shelley in the climactic stanza of his great poem about the death of his friend and rival, John Keats. But Shelley’s poem, “Adonaïs,” is not about peace — rather the opposite. If anything, it’s about the strife and anguish from which human life is never free.

He is not dead, he doth not sleep,
He hath awaken’d from the dream of life;
‘Tis we, who lost in stormy visions, keep
With phantoms an unprofitable strife,
And in mad trance, strike with our spirit’s knife
Invulnerable nothings. We decay
Like corpses in a charnel; fear and grief
Convulse us and consume us day by day,
And cold hopes swarm like worms within our living clay.

This is a form of consolation common to poetry and religion, one much in demand over the past 12 months as we have lost David Bowie, Muhammad Ali, Prince, Leonard Cohen, Alan Rickman, George Michael, Carrie Fisher and Debbie Reynolds (just off the top of my head) and have suffered the not-entirely-metaphorical death of our democracy, which has been sick far longer than any of those people. If you grew up amid Anglo-American pop culture of the 1970s and ’80s, and if you once held a burnished view of American tradition and American possibility — that describes, I think, a large number of people — this has been a tough year. I can’t promise I can make any of it better, but I can assure you that all the emotions we now feel have been felt before. Maybe that counts for something.

Confronting the mortality of famous people is always a way of confronting our own, I suppose, just as the tales of their marriages and divorces and affairs seem to echo and deepen our own histories of relationship success or failure. If you belong to the micro-generation that assumed that most of the people on that list would always be part of our lives, as I do, then 2016 has offered an especially pungent reminder that there is no such thing as “always,” and that our day is coming sooner than we would like. If your year was also not easy for other, more personal reasons (as mine certainly was), that seems to go with the territory.

As a child, I rushed out to the driveway for the newspaper on the morning after Ali’s big Madison Square Garden fight with Joe Frazier, and was crushed to learn that the mighty hero had fallen. A few years after that, Bowie’s late ’70s records offered me my first glimpse into a realm of bohemian adventure that actually existed, in real life and on the same continent where I lived, and not just in books about the 1920s or the 19th century. Add a few more years, and Prince emerged as the perfect distillation of white and black pop, a symbol of racial and cultural liberation sent to free us from the Reagan years. I didn’t learn to appreciate Cohen’s music until adulthood, when (again, along with many other people) I realized that he was not some folk-rock phenomenon constrained by the ’60s but something closer to a modern-day prophet.

Each of them, like the other people on that list, had a long and complicated life with many conflicting currents, and I won’t even try to do justice to that complexity here. But it did not occur to me that I would live to see them all dead, or that those deaths would all occur in a year that had so many other ways to make us mourn for lost time and lost opportunities, so many ways of reminding us that time is fleeting, and to gather our rosebuds while we may.

I didn’t have the same personal relationships with other people on that list, or with others I haven’t mentioned (Edward Albee or Elie Wiesel or George Martin or Gloria Naylor or Maurice White or Mose Allison — we could go on). But you may, and people each of us knows almost certainly do. Someone close to me was really broken up over Alan Rickman, who was one of the greatest screen and stage actors of our time, and I don’t begrudge anyone, gay or otherwise, for perceiving George Michael as a sui generis figure — a Keatsian figure, if ever there was one — who broke new ground in pop music. (“Listen Without Prejudice Vol. 1” is simply a great record, so great it seemed to have defeated its creator in some ways.)

I don’t want to dwell too much on the perhaps-terminal decline of American democracy, which this publication and everyone else in the media has been worrying over for the last year and a half, like a dog with an old mutton bone. It’s not as if people who supported the incoming president are incapable of grief and sorrow (although I suspect they are underrepresented in the Bowie and Prince fanbases). But for many of us the inexplicable political events of 2016, which remain difficult to believe, even now that they have happened, are at once the atmosphere, the subtext and the inner meaning of all this death. I was not an especially avid supporter of Hillary Clinton, but for many American women (and men) the perverse tale of how she was denied the presidency yet again in her final campaign is another of this year’s great losses. The vision of a woman president came so close to reality, but remains a dream deferred.

We have a way, as human beings, of staring into the darkness and seeing light. We’re going to need that now. In some ways, what Shelley has to tell us in “Adonaïs” is highly conventional: Whatever you believe awaits us on the other side — something or nothing, heaven or hell — at least the struggles of this life are over. Mourning is essentially a form of self-indulgence; it is we who suffer, not the dead. Shelley wrote that poem, of course, while still amid the mad trance of life, locked in unprofitable strife with phantoms: He had one eye on his dead friend and the other on posterity, and was clearly trying to go head to head with John Milton’s “Lycidas,” written nearly two centuries earlier, the first really famous pastoral elegy for a dead friend in the English tradition.

Weep no more, woeful shepherds, weep no more,
For Lycidas, your sorrow, is not dead,
Sunk though he be beneath the wat’ry floor;
So sinks the day-star in the ocean bed,
And yet anon repairs his drooping head,
And tricks his beams, and with new spangled ore
Flames in the forehead of the morning sky:
So Lycidas sunk low, but mounted high
Through the dear might of him that walk’d the waves …

Milton refers us back to Christian redemption as the reason not to feel depressed about death and loss, or at least he thinks he does. (I’m inclined to argue that he invented Romanticism without meaning to, and was constantly at war with his own faith.) But the idea at work here, that light must come out of darkness and hope can be found amid deep personal despair — the belief in literal or allegorical transcendence — is such a cultural constant across literary and religious traditions that it has to mean something. Admittedly, that “something” might just be that biology drives us onward, and those of us who find ourselves still living while others die make up reasons to keep going, because our brains are over-evolved and we can’t help thinking about these things. Cats and beetles, so far as we can tell, don’t ask themselves these questions.

Friedrich Nietzsche’s famous maxim that “what does not kill me makes me stronger” has been repurposed so much by football coaches and military strategists that its original ambiguity has gotten lost. Like most of the mad German’s pronouncements, that one is double-edged and purposefully unclear. Nietzsche knew from experience, for example, that physical illness does not make you stronger in any ordinary sense. (That passage, in fact, comes from “Twilight of the Idols,” his next-to-last major work.) I take his statement to mean that confronting death and mortality directly, as we draw nearer to our own deaths, fortifies us to better use the hours and days we have left.

Nearly everyone I know is coming out of 2016 beset by deep feelings of grief and loss. If we have been made stronger in that sense, we will be more than strong enough for whatever lies ahead: death or transformation, political or cultural or personal. Walt Whitman was thinking of something like this, in a more optimistic key, in perhaps the greatest of his poems, “When Lilacs Last in the Dooryard Bloom’d.” He imagines making friends with death, holding hands with death, and even arriving at “a sacred knowledge of death,” as a way of dealing with the assassination of Abraham Lincoln (named in the poem only as “him I loved”), a grievous loss that did not quite kill America and may, for a while, have made it stronger.

And the streets how their throbbings throbb’d, and the cities pent — lo, then and there,
Falling upon them all and among them all, enveloping me with the rest,
Appear’d the cloud, appear’d the long black trail,
And I knew death, its thought, and the sacred knowledge of death.

Then with the knowledge of death as walking one side of me,
And the thought of death close-walking the other side of me,
And I in the middle as with companions, and as holding the hands of companions,
I fled forth to the hiding receiving night that talks not,
Down to the shores of the water, the path by the swamp in the dimness,
To the solemn shadowy cedars and ghostly pines so still.

 

9 Weird and Wonderful Facts About Death and Funeral Practice

Your ancestors may have shared a coffin.


It might not be something you want to think about very often, but it turns out that the way we treat our dead in the modern age is heavily influenced by the way our ancestors treated theirs.

When you look at death and funeral practices through the ages, repeated patterns of behaviour emerge, making it easy to see where some of our modern ideas about death – such as keeping an urn on your mantelpiece or having a gravestone – have come from.

So here are nine surprising facts about death and funeral practices through the ages:

1. Some prehistoric societies defleshed the bones

This was done with sharp knives. We know this because human skeletons buried during this period show traces of many cut marks on the skulls, limbs and other bones.

During the medieval period, bodies that needed to be transported over long distances for burial were also defleshed – by dismembering the body and boiling the pieces. The bones were then transported, while the soft tissues were buried close to the place of death.

2. Throwing spears at the dead

During the Middle Iron Age, “speared-corpse” burials were a pretty big deal in east Yorkshire. Spears were thrown or placed into the graves of some young men – and in a couple of instances they appear to have been thrown with enough force to pierce the body. It is unclear why this was done, but it may have been a military send-off – similar to the 21-gun salute at modern military funerals.

3. The Romans introduced gravestones

Gravestones were an imported practice, and the first ones in Britain were concentrated close to Roman military forts and the more urbanised Romano-British settlements.

Back then, gravestones were more frequently dedicated to women and children than to Roman soldiers. This was most likely because Roman soldiers were not legally allowed to marry, so monuments to their deceased family members legitimised their relationships in death in a way they couldn’t be in life.

After the end of Roman control in Britain in the fifth century, gravestones fell out of favour and did not become widely popular again until the modern era.

4. The Anglo Saxons preferred urns

During the early Anglo-Saxon period, cremated remains were often kept within the community for some time before burial. We know this because groups of urns were sometimes buried together, and urns were also sometimes included in the burials of other people – likely the relatives of those whose ashes they held.

5. Lots of people shared a coffin

During the medieval period, many parish churches had community coffins, which could be borrowed or leased to transport the deceased person from the home to the churchyard. When they arrived at the graveside, the body would be removed from the coffin and buried in a simple shroud.

6. And rosemary wasn’t just for potatoes

Sprigs of rosemary were often carried by people in the funeral procession and cast onto the coffin before burial, much as roses are today. And as an evergreen plant, rosemary was associated with eternal life. As a fragrant herb, it was also often placed inside coffins to conceal any odours that might be emerging from the corpse. This was important because bodies often lay in state for days and sometimes weeks before burial, while preparations were made and mourners travelled to attend the funeral.

7. Touching a murderer could heal

Throughout early modern times, and up until at least the mid 19th century, it was a common belief that the touch of a murderer – executed by hanging – could cure all kinds of illnesses, ranging from cancer and goitres to skin conditions. Afflicted persons would attend executions hoping to receive the “death stroke” of the executed prisoner.

8. There are still many mysteries

Archaeologists don’t really know what kinds of funeral practices were being performed across much of Britain for almost a thousand years during the British Iron Age. Human remains only appear in a few places – like the burials in east Yorkshire. So for much of Britain, funeral practices are almost invisible. We suspect bodies were either exposed to the elements in a practice known as “excarnation”, or cremated and the ashes scattered.

9. But the living did respect the dead

Across time, people have engaged with past monuments to the dead, and it is common for people to respect older features of the landscape when deciding where to place new burials.

Bronze Age people created new funeral monuments and buried their dead in close proximity to Neolithic funeral monuments. This can be seen in the landscape around Stonehenge, which was created as an ancestral and funeral monument – and is full of Bronze Age burial mounds known as round barrows.

And when the Anglo-Saxons arrived in Britain, they frequently buried their dead close to Bronze and Iron Age monuments. Sometimes they dug into these older monuments and reused them to bury their own dead.

Even today, green burial grounds tend to respect preexisting field boundaries. And in at least one modern cemetery, burials are placed in alignment with medieval “ridge and furrow”. These are the peaks and troughs in the landscape resulting from medieval ploughing.

The Conversation

This article was originally published on The Conversation. Read the original article.

http://www.alternet.org/culture/9-weird-and-wonderful-facts-about-death-and-funeral-practice?akid=15058.265072.uGa8gZ&rd=1&src=newsletter1069675&t=26

How People Died 100+ Years Ago, and How We Die Today

Fascinating data shows life was hell just a century ago.


An interesting new study compares the leading causes of death today against the leading causes of death in 1901, providing a look at how much our world has changed over the course of a century.

One thing we never learned growing up, while watching “Bonanza” or “Gunsmoke” or “Rawhide,” was how truly fragile life could be in the Old West. If we took our cowboy TV seriously, we might think the leading cause of death was a gunfight, or hanging for horse thievery, or maybe falling off a horse. The truth was much more mundane. The leading cause of death in 1901, with almost 59,000 cases, was actually diarrhea and other intestinal distresses. Not far behind was tuberculosis at 55,000. Pneumonia was third at 48,000.

Today we take medicines like antibiotics or vaccines for granted, but without them our death statistics would likely look very similar to those of 1901. Here’s the rest of the list:

4. Heart disease

5. Bright’s Disease (kidney disease)

6. Congenital disorders (birth defects)

7. Apoplexy (stroke or brain hemorrhage)

8. Unknown causes

9. Premature birth

10. Convulsions

Looking at this list, we can see that none of these causes of death, save for heart and kidney disease, is widespread today. The lack of medical knowledge back then made survivable circumstances fatal.

The leading cause of death in the U.S. today is certainly not diarrhea. (For that, we just down a few tablespoons of Pepto Bismol.) Instead, heart disease has leapfrogged to number one, with 614,000 deaths. Following heart disease is cancer, with 591,000, and chronic lung disease a distant third at 147,000.

The rest of the list looks like this:

4. Accidents

5. Stroke

6. Alzheimer’s disease

7. Diabetes

8. Influenza and pneumonia

9. Kidney disease

10. Suicide

What strikes one most of all from this modern list is that most of these causes of death are age-related. The longer we live, the more likely we are to fall victim to heart disease, cancer, emphysema, stroke or Alzheimer’s. The reason most of these are not on the 1901 list is that the average life expectancy in that era was only about 50 years. People simply didn’t live long enough to get cancer back then. Meanwhile, most of the causes of death on the 1901 list are absent from the modern list because modern medicine has figured out treatments for them.

The modern list is also notable in that the fourth leading cause of death is accident, something missing from the 1901 list. We can speculate that the reasons for this have much to do with technology. Back in 1901, people got from here to there mostly by horse or railroad. Today, motor vehicles take us where we want to go, often at very high speeds. A horse accident was unlikely to cause death; in fact, it was only the 11th-highest cause of accidental death in 1901, with about 550 deaths. Meanwhile, over 32,000 people died in 2014 from car accidents alone. The list of deaths by accident in 1901 seems almost quaint today:

1. Heat and sunstroke (lack of AC, perhaps, or working outdoors under the hot sun?)

2. Railroad accidents

3. Drowning

4. Burns and scalds

5. Bone fractures/dislocations

6. Birth injuries

7. Accidental poisonings

8. Gunshot wounds (now there’s a familiar cause of death)

9. Suffocation

10. Poisonous gases (we can speculate from mining, mostly)

11. Horses and other vehicles (very few cars back then)

12. Mine injuries

13. Chronic poisoning (probably workplace substances no one knew would kill people over time)

14. Machinery

15. Hypothermia (think of this next time you complain your radiator is too hot in the winter)

16. Lightning

Breaking down causes of death by age group, these days babies and infants under one year old are not likely to die from diarrhea, as was the case in 1901. In fact, unless a child is born with a fatal birth defect, she is likely to make it past one year of age. In 1901, those who died at age 2 likely had pneumonia as their cause of death; ages 3 to 9, diphtheria; and 10 to 54, tuberculosis—all curable diseases today.

Meanwhile, today at age 2 right up to age 44, having conquered most of the childhood diseases (although anti-vaxxers seem determined to roll back those gains), you are likely to survive unless you are the victim of an accident. In 1901, from age 55 to 79, heart disease was the killer. Today, from age 45 to 64, look out for cancer. Finally, in 1901, if you made it to 80, death came most likely from natural causes as a result of old age, while heart disease is the cause of death in modern-day older people, age 65 and over.

The bottom line? We may look back nostalgically on times past, but that’s mostly because modern science gives us that luxury. If we actually lived back then, we would more than likely have died well before we had a chance to be nostalgic.

Larry Schwartz is a Brooklyn-based freelance writer with a focus on health, science and American history. 

4 Ways the One Percent Is Trying to Buy Their Immortality

Tech oligarchs are deathly serious about buying off the grim reaper.


Humankind has long dreamed of immortality. Surely, somewhere, Ponce de Leon’s Fountain of Youth awaits, allowing us to escape our inevitable fate of non-existence. Not surprisingly, some very wealthy tech executives are determined to buy their way out of that inevitability. These guys are living the high life and they don’t want it to stop.

Peter Thiel, a co-founder of PayPal, is worth somewhere north of $2 billion, and a member in good standing of the one-percent club that is projected to control half the planet’s wealth by next year. While economic inequality does not appear to weigh on him, other forms do. “Probably the most extreme form of inequality is between people who are alive and people who are dead,” he told the New Yorker recently. His thoughts on death? “Basically, I’m against it,” he told the Telegraph newspaper.

Oracle founder Larry Ellison has similar deep thoughts. “Death has never made any sense to me.” A billionaire many times over, he has contributed over $400 million to research so far, much of it through his Ellison Medical Foundation, in his quest to live forever. Larry Page and Sergey Brin, Mark Zuckerberg, Sean Parker, and Pierre Omidyar of eBay have all placed hefty bets on science, and sometimes pseudo-science, to make sure they stick around a long, long time. “The talent migrating into the field is like nothing I’ve seen in my 40 years in the field,” Ken Dychtwald, a gerontology and longevity expert, told Time magazine, “and they’re convinced there is nothing you can’t do if you can turn biotechnology into information technology.”

The ethics behind the search for immortality are dicey, to say the least. “There will be breakthroughs in the next 15 or 20 years that will have to do with aging itself—actually stopping the biological clock,” Dychtwald told Time. “And I think that really rich people are going to get access to it… Imagine a time when ten thousand really rich people get to live forever, or not have to get dementia.”

Research into life extension costs a lot of money, and it seems reasonable to assume that if there are indeed breakthroughs, those various immortality pills, injections or whatever are going to cost a lot of money, too. Which raises the question, what about the rest of us? If there seems to be a budding confrontation between the economic haves and have-nots, what can be said about a world where there are mortals and immortals? Not to mention the fact that in a world straining to provide resources for 7 billion humans having a normal lifespan, people who never die are going to put a real strain on things.

“Extending the average human lifespan is a great example of something that is individually desirable by almost everyone,” Francis Fukuyama, a political theorist and bioethicist told the Washington Post, “but collectively not a good thing. For evolutionary reasons, there is a good reason why we die when we do.” Without an incentive to adapt in order to survive, said Fukuyama, social change would grind to a halt and dictators could rule for centuries, not decades.

Undaunted, these wealthy tech barons move forward, as confident in their quest to live forever as they were in founding their digital empires. For them, the traditional pace of science moves too slowly. Institutions like the National Institutes of Health, they believe, thrive too much on consensus and are not willing to take the risks that lead to major breakthroughs. “We want to jailbreak them [scientists] from existing research institutions and set them free,” reads the manifesto for Breakout Labs, Peter Thiel’s grant-making group. “It’s those who have an optimism about what can be done that will shape the future…I believe that evolution is a true account of nature but I think we should try to escape it or transcend it in our society,” Thiel told the Post.

Not every fabulously wealthy tech giant agrees with the goals of these immortalists. Bill Gates wrote on the website Reddit that he believes it is egocentric to seek to live forever when so many people on Earth lack basic needs and are dying of curable diseases like malaria and tuberculosis. Gates feels the resources would be better spent helping millions of people live normal lifespans rather than extending the lives of a relatively few multi-millionaires and billionaires.

Here are four prominent proposed pathways to immortality whose research is being supported by the modern tech oligarchy.

1. Avatars and artificial brains. Russian scientist Dmitry Itskov, 31, believes that immortality can be achieved by the year 2045. On his website, 2045.com, Itskov solicits money from the wealthy and in return promises the end of death within 30 years. He states that the goal is “to create technologies enabling the transfer of an individual’s personality to a more advanced non-biological carrier, and extending life, including to the point of immortality.” His plan involves four steps. First, the development of a robot or “avatar” that is linked to our present selves by way of a computer chip. This is to be achieved by 2020. Following this, the human brain will be transplanted into the avatar. The target date for this achievement is 2025. By the year 2035, human consciousness will be transferred or downloaded into an artificial brain within the avatar, replacing the biological brain. And finally, by 2045, consciousness will evolve into a sort of Internet of global consciousness, with no need for a physical presence at all. Robots that have developed their own artificial intelligence will carry out necessary physical tasks. Humanity will have transformed into something else entirely.

If it sounds farfetched, that’s because it is. But the world’s tech sector has bought into the idea that humanity’s knowledge, because of the technology they have developed, will continue to grow exponentially, and what we know now will be dwarfed by what we know in five or 10 or 30 years.

2. Ending death by disease. Brian Singerman, a venture capitalist and partner in the Founders Fund, and his partners, including Peter Thiel and Sean Parker, are placing their faith and their wallets in the hands of biotech companies that are busily studying ways to cheat death. These companies are looking for ways to cure cancer and end the scourge of aging. Says Singerman in Inc. Magazine: “We have a company that’s charged with curing all viral disease, we have a company that’s charged with curing several types of cancer. These are not things that are incremental approaches. It’s all fine and good to have a drug that extends life by a certain amount of months or makes living with a disease easier. That’s not what we’re looking for. We are not looking for incremental change. We are looking for absolute cures in anything we do.”

Singerman believes that within 10 years all viral disease will be curable, and within that same time frame we will have a clearer understanding of what aging is, what causes it, and how to begin to stop it.

3. Genetic tinkering. Human beings are not roundworms. But breakthroughs in human life extension might have started with the lowly roundworm. In 2001, scientists burrowed down to the cellular level and added an extra gene (known as SIR2) to the roundworm. Normally, the roundworm’s lifespan is about two weeks. The roundworms with the extra gene lived for three weeks. This result mimicked the results obtained with calorie restriction, another way scientists have prolonged life in test animals. That may not sound like much, but if scientists manage to add a comparable increase to the normal human lifespan, we’d all live about a third longer. And that’s just the beginning. Ultimately, there are scientists who believe we can genetically turn off the aging process altogether.

4. Cryonics. No, Walt Disney is not frozen in some secret vault, waiting for the day when he can lead his Mickey Mouse empire again. But Ted Williams, baseball legend, is. The practice, properly called cryonics (often loosely referred to as cryogenics), works like this in theory: they freeze your body (or, if you are feeling a bit short of change, since the process can cost up to $200,000, just your head), wait until science discovers a cure for what killed you, and then thaw the body out, revive and cure you. Critics of this method point out that at our current level of technology, by freezing the body, we are damaging the cells beyond repair. Imagine, for instance, a bag of frozen strawberries. Once defrosted, the formerly plump firm fruit is now soft and mushy. That is pretty much how the defrosted body would turn out.

But true believers dissent. Alcor, a leading cryonics firm (and one of the recipients of Peter Thiel’s largesse), states on its website, “We believe medical technology will advance further in coming decades than it has in the past several centuries, enabling it to heal damage at the cellular and molecular levels and to restore full physical and mental health.”

Larry Schwartz is a Brooklyn-based freelance writer with a focus on health, science and American history. 

What Good Is Thinking About Death?

We’re all going to die and we all know it. This can be both a burden and a blessing.


In the heart of every parent lives the tightly coiled nightmare that his child will die. It might spring at logical times—when a toddler runs into the street, say—or it might sneak up in quieter moments. The fear is a helpful evolutionary motivation for parents to protect their children, but it’s haunting nonetheless.

The ancient Stoic philosopher Epictetus advised parents to indulge that fear. “What harm is it, just when you are kissing your little child, to say: Tomorrow you will die?” he wrote in his Discourses.

Some might say Epictetus was an asshole. William Irvine thinks he was on to something.

“The Stoics had the insight that the prospect of death can actually make our lives much happier than they would otherwise be,” he says. “You’re supposed to allow yourself to have a flickering thought that someday you’re going to die, and someday the people you love are going to die. I’ve tried it, and it’s incredibly powerful. Well, I am a 21st-century practicing Stoic.”

He’s a little late to the party. Stoicism as a school of philosophy rose to prominence in the 3rd century B.C. in Greece, then migrated to the Roman Empire, and hung around there through the reign of emperor Marcus Aurelius, who died in 180 A.D. “That Stoicism has seen better days is obvious,” Irvine, a professor of philosophy at Wright State University, writes in his book A Guide to the Good Life: The Ancient Art of Stoic Joy. He stumbled across the philosophy when researching a book on Zen Buddhism—“I thought I wanted to be a Zen Buddhist,” he says, “but Stoicism just had a much more rational approach.”

Though the word “stoic” in modern parlance is associated with a lack of feeling, in his book, Irvine argues that the philosophy offers a recipe for happiness, in part by thinking about bad things that might happen to you. The big one, obviously, is death—both yours and that of people you love.

“We can do it on a daily basis, simply by imagining how things can be worse than they are,” he says. “Then when they aren’t that way, isn’t that just wonderful? Isn’t it simply wonderful that I get another day to get this right?”

For Irvine and the Stoics, thoughts of death inspire gratitude. For many others, thinking about The End inspires fear or anxiety. In fact, the latter may be the natural human condition.

* * *

“We are different from other animals in that we are uniquely aware of our own mortality,” says Ken Vail, an assistant professor of psychology at Cleveland State University. “Certainly other animals recognize they can die—if a cheetah chases an impala, or chases us, both us and the impala are going to run away. We recognize that as an immediate threat of mortality. But the impala doesn’t sit in the safety of its office aware of the fact that it will eventually die. And we do.”

This is the price we pay for the nice things consciousness has given us—self-reflection, art, engineering, long-term planning, cooking our food and adding spices to it instead of just chomping raw meat straight off the bones of another animal, etc. We’re all going to die and we all know it.

But we’re not always actively thinking about it. When people are reminded of death, they employ a variety of strategies to cope—not all of which are as well-adjusted as Stoic gratitude. That many kinds of human behavior stem from a fear of death is the basis of one of the most prominent theories in modern social psychology—terror-management theory.

Terror-management theory exists because one day, some 30-odd years ago, Sheldon Solomon was perusing the library at Skidmore College, where he’s a professor of psychology, and he happened to pick up The Birth and Death of Meaning, by Ernest Becker. “This is nothing to be proud of, but the cover is white with green splotches on it, and I was like ‘Ooh, what an interesting color,’” Solomon says. “Then I liked that it was a short book with big print. Again, nothing to be proud of, but true. And that’s why I reached for it.”

Once he opened the book, though, Solomon was taken by its central question—Why do people do what they do?—and how it was presented, without “turgid academic jargon,” he says. Becker offered an answer to that question: People do a lot of the things that they do to quell their fear of death. So Solomon and two of his friends from grad school, Jeff Greenberg and Tom Pyszczynski, set out to test that idea empirically.

* * *

The only antidote to death is immortality. And so, terror-management theory holds, when faced with the idea of death, people turn to things they believe will give them immortality, literal or otherwise. The hope of true immortality can be found in religion’s promises of heaven or reincarnation, or in some of science’s more dubious life-extension promises (Just freeze your dead body! They’ll wake it up later!).

More often though, it’s the hope of symbolic immortality that calms the frightened rabbits of death-fearing hearts—the idea that people are a part of something that will last longer than they do. Their culture, their country, their family, their work. When thinking of death, people cling more intensely to the institutions they’re a part of, and the worldviews they hold.

What that actually means in terms of behavior, is trickier. The research shows that what people do when they’re feeling aware of their mortality depends on the person, the situation she’s in, and whether she’s focusing on death or it’s just in the back of her mind. (The TMT literature, which details a wide range of effects, is now fairly substantial. A 2010 metareview found 238 TMT studies, and this page on the University of Missouri website lists nearly 600, though it doesn’t seem to have been updated since 2012).

When death is in the front of your mind—when you pass by a cemetery, when someone you know is sick (or when, in a lab, a researcher has just asked you about it)—the tendency, according to TMT, is to want to push those thoughts away. You might suppress the thoughts, distract yourself with something else, or comfort yourself with the idea that your death is a long way away, and anyway, you’re definitely going to go to the gym tomorrow.

A couple of studies have shown that conscious thoughts of death do increase health intentions, for exercise and medical screenings, though whether people actually follow through on those intentions is unclear. Promising yourself you’ll eat better may just be a strategy to get death off your mind.

When death is on people’s conscious minds, “they can wield logic to deal with it,” Vail says. “This would be similar to your mom saying, ‘Put on your seatbelt, you don’t want to die.’ So you think about that and recognize, yes, she’s right, you don’t want to bite it on the way to the grocery store, so you put on your seatbelt.”

According to Solomon, even young children use versions of these same strategies. His new book, written with Greenberg and Pyszczynski, The Worm at the Core: On the Role of Death in Life, cites the story of 5-year-old Richard, from a series of interviews the psychologist Sylvia Anthony conducted in the 60s and 70s:

“He swam up and down in his bath [and] he played with the possibility of never dying: ‘I don’t want to be dead, ever; I don’t want to die.’ … After his mother told 5-year-old Richard that he wouldn’t die for a long time, the little boy smiled and said, ‘That’s all right. I’ve been worried, and now I can get happy.’ Then he said he would like to dream about ‘going shopping and buying things.’”

Classic distraction move, Richard. Though at times, our own coping mechanisms may not be much more sophisticated. “Americans are arguably the best in the world at burying existential anxieties under a mound of French fries and a trip to Walmart to save a nickel on a lemon and a flamethrower,” Solomon says.

But shopping excursions can only distract you so much. Even once you stop actively thinking about it, death is still prominent in your nonconscious mind. “One metaphor is the file drawer,” Vail says. “You pull out a file and read it, then you get distracted, now you’re thinking about dinner. You put [the file] back in the drawer, you pull out dinner, now you’re looking at dinner, but whatever you were thinking about previously is now on the top of the file. It’s the closest thing to your conscious awareness.”

This is when, the research shows, people’s attitudes and behaviors are most affected—when you’ve recently been reminded of death, but it’s moved to the back of your mind.

Unfortunately, a lot of what death brings out when it’s sitting at the top of the file drawer is not humanity’s most sterling qualities. If people feel motivated to uphold their own cultures and worldviews in the face of death, it stands to reason that they might be less friendly toward other worldviews and the people who hold them.

The very first terror-management study involved “22 municipal-court judges in Tucson, Arizona,” according to The Worm at the Core. The judges were tasked with setting bail for alleged prostitutes, but first they were asked to take a survey. Some of them just answered personality questions, but some were also asked two questions about death: “Please briefly describe the emotions that the thought of your own death arouses in you,” and “Jot down, as specifically as you can, what you think will happen to you as you physically die, and once you are physically dead.” The standard bail at the time was $50, set by judges who didn’t take the survey. The ones who did take the survey set the bail an average of nine times higher.

“The results showed that the judges who thought about their own mortality reacted by trying to do the right thing as prescribed by their culture,” the book reads. “Accordingly, they upheld the law more vigorously than their colleagues who were not reminded of death.”

But, Solomon says, the researchers later repeated that study with students, and found that only those who thought prostitution was “morally reprehensible” opted to set a harsher bail. The logic goes that those students wanted to uphold their values, and punish transgressors. Since then, more studies have shown this tendency: When mortality’s on their minds, people prefer others in their (cultural/racial/national/religious) group to those outside it. This dynamic has manifested in silly ways—in one study liberals were more likely to make conservatives eat a gross hot sauce after a death reminder and vice versa—and in more serious ones—reminders of mortality have been shown to make people more likely to stereotype others.

While wanting to promote your own worldviews can mean putting others’ down, that isn’t the only way people seek to feel like part of something greater than themselves—searching for that symbolic immortality. Looming mortality can also lead people to help others, donate to charity, and want to invest in caring families and relationships. (And studies have backed up that people do these things when reminded of death.)

These reactions have also been observed outside the lab, after the terrorist attacks of September 11, 2001, when death was likely top of mind for many Americans for quite a while. Comparisons of survey answers before and two months after 9/11 found increases in kindness, love, hope, spirituality, gratitude, leadership, and teamwork, which persisted (though to a slightly lesser degree) 10 months after the attacks. But Solomon, Greenberg, and Pyszczynski point out in their book that there was also a lot of fear and derogation by Americans of the “other” after 9/11, specifically Muslim and Arab others.

“It’s not the case that awareness of mortality and the ensuing terror-management process is an inherently negative one that causes prejudice and closed-mindedness and hostility but instead it appears to be simply rather a neutral process,” Vail says. “It’s one that motivates people to indiscriminately uphold and defend their cultural worldviews.”

How you manage your terror, then, depends on what’s already important to you—and that’s what you’ll turn to when confronted with mortality. In one study, empathetic people were more likely to forgive transgressions after a death reminder; in another, fundamentalist religious people were more compassionate after thinking of their own mortality—but only when compassionate values were framed in a religious context, such as excerpts from the Bible or Koran.

* * *

Terror-management theory contends that there’s something different about our fear of death, compared to other fears. Every other threat is survivable, after all. And in research, thinking about death has produced just as strong an effect whether the alternative was something neutral, or another threat like rejection or pain. So a fear of death is not just like a fear of rejection, except more.

Except Steven Heine, a professor of psychology at the University of British Columbia, doesn’t think death is necessarily such a unique threat. In 2006, he and fellow researchers Travis Proulx and Kathleen Vohs developed the Meaning Maintenance Model, which says yes, thinking about death can inspire these attitudes and behaviors, but for a different reason. Death, according to their theory, is a threat to the way we understand the world, similar to uncertainty, being rejected by a friend, or even—Heine’s example—finding a red queen of spades in a deck of cards. All these things interrupt what Heine calls “meaning frameworks,” our understandings of how the world works. “When we think about the fact that we’re going to die, it calls all of those assumptions into question. All these things I’m trying to do, I won’t be able to succeed, my relationships will be severed, the way I think I fit into the world, ultimately I no longer will. This is bothersome.”

But perhaps not more bothersome than other threats to meaning. Heine says Meaning Maintenance Model studies have found that thinking about death does not have a noticeably larger effect on people’s attitudes and behaviors than, say, watching a surreal movie. A metareview of TMT studies also notes that the effects of thinking about death are less significant when compared with thinking about something else that threatens someone’s sense of meaning.

Thoughts of death still lead people to uphold their worldviews according to this theory, but it’s because, when faced with an idea as confounding as one’s own mortality, people turn to the other things in their lives that still make sense to them. While the two theories have a lot in common, Heine says MMM can explain one thing that TMT cannot: suicide.

“TMT would argue that while we want to have a sense of meaning as a way of keeping away thoughts of death, one of the key motivators of suicide is feeling that your life isn’t very meaningful, wanting death when you feel like you don’t have sufficient meaning in your life,” he says.

The thing that makes death different, Heine says, is that it’s not solvable. With other meaning threats, you can try to fix the problem, or adjust your worldview to accommodate the new information. “The fact that we’re going to die is a problem that we can never fully resolve throughout our lives,” he says.

But maybe that’s for the best.

* * *

“I know we’re supposed to be super afraid of death. But it’s good, isn’t it?” asks Laura King, Curators’ Professor of Psychological Sciences at the University of Missouri, Columbia. “If life never ended, think about it, right? Isn’t that like every vampire story or sci-fi movie? If you live too long, after a while, you just lose it. Life no longer has any meaning, because it’s commonplace.”

King did a study in 2009 that offers an alternative, economical perspective on death and meaning. She showed that after reminders of death, people valued life more highly—and conversely, reading a passage that placed a high monetary value on the human body increased people’s number of death thoughts. This is the scarcity principle, plain and simple—the less you have of something, the more you value it.

But “most of us don’t live like we’re aware that life is a finite commodity,” King says. She describes an exercise she has her students do, in which they write down their life goals, and then write what they’d do if they only had three weeks to live. “Then you say, ‘Why aren’t you doing those things?’ They say, ‘Get real, hello, we have a future to plan for.’”

“Live every day as though it’s your last” is nice but profoundly unhelpful advice, when you know that today is probably not your last day. I’m not sure what I’d do if I was going to die tomorrow—round up all my loved ones and fly them to Paris? Or maybe just throw them a really nice dinner party, the kind where everyone ends up sprawled out on couches, overstuffed and warm from the wine.

Either way, I can’t do that today. I have to go to work.

“Everybody always says life is too short,” King says, “but it’s really long. It’s really, really long.”

Once people’s days truly are numbered, their priorities do seem to shift. According to research done on socioemotional selectivity theory, older people are more present-oriented than younger people, and are more selective in who they spend time with, sticking mostly with family and old, close friends. Other studies have shown them to also be more forgiving, and to care more for others, and less about enhancing themselves.

This all fits in well with Irvine’s Stoic philosophy. Rather than pulling curtains over the darkness on the other side of the window, you stare straight into it, so when you turn away you’re thankful for the light.

Irvine gives the mundane example of buying a lawn mower. “As I’m doing it, I have the realization that this is conceivably the last lawn mower I will ever buy,” he says. “I don’t like mowing the lawn, don’t get me wrong, but I’ve only got X number of times it’s going to happen. Some day, this moment, right now, is going to count as the good old days.”

* * *

Unfortunately, Western culture isn’t exactly death-friendly. Death is kept largely out of sight, out of mind, the details left to hospitals and funeral parlors. Though most Americans say they want to die at home, few actually do—only about 25 percent, according to the Centers for Disease Control and Prevention. Most other people die in hospitals, nursing homes, or other facilities.

This is why, in 2011, the mortician Caitlin Doughty founded The Order of the Good Death, a self-described “group of funeral industry professionals, academics, and artists exploring ways to prepare a death-phobic culture for inevitable mortality.” She’s also written a book about working in a crematory, Smoke Gets in Your Eyes, and hosts the “Ask a Mortician” webseries.

“Death doesn’t go away just because we hide it,” Doughty wrote to me in an email. “Hiding life’s truths doesn’t mean they disappear. It means they are forced into darker parts of our consciousness … Death is the most natural thing in the world, and treating it as deviant isn’t doing our culture any favors … We don’t control nature. We aren’t higher-ranking than nature.”

This is terror management writ large, a culture that pushes death away as best it can. Even though, ultimately, it can’t.

More people are coming around to Doughty’s way of thinking. “Death salons” and “death cafes,” where people gather to talk about their mortality, have sprung up across the U.S., and many doctors, like the Being Mortal author Atul Gawande, are working to advance the conversation around end-of-life care, getting patients involved in planning for their deaths.

But the research shows the effects of thinking about death aren’t all grace and gratitude. So would bringing death out into the open ultimately help or hurt humanity?

“At first, thinking about death regularly made me move up and down and way up and way down the emotional spectrum,” Doughty writes. “But over time thinking about death moves you closer to magnanimity. You realize that you will have to give your body, your atoms and molecules, back to the universe when you’re done with them.”

She also points out that TMT studies are isolated instances, and don’t look at what happens when people think about death regularly, over time.

Maybe the key, then, is being deliberate. Not letting thoughts of death sneak up on you, but actively engaging with them, even if it’s hard. In one 2010 study, people who were more mindful were less defensive of their worldviews after being reminded of death, suggesting that “mindfulness can potentially disrupt some of these kinds of processes that go into terror management,” says Vail, the Cleveland State University psychologist.

Solomon, too, is hopeful. “I like to think there comes a moment where sustained efforts to come to terms with death pay off.” Vail suggests that freeing oneself from the psychological reactions to death might get rid of the good effects along with the bad, but Solomon’s willing to take the trade. “If you look at the problems that currently befall humanity—we can’t get along with each other, we’re pissing on the environment, [there’s] rampant economic instability by virtue of mindless conspicuous consumption—they’re all malignant manifestations of death anxiety running amok.”

It’s probably not possible to erase all fear of death—animals have a drive to survive, and we are animals, even with all that consciousness. Even if being mindful about death means getting rid of the good along with the bad consequences of death anxiety, people can be generous and love each other without being scared into it.

“Death destroys a man, but the idea of death saves him,” E.M. Forster once wrote. I don’t know if there’s really any salvation, but if we accept death, maybe we can just live.

http://www.theatlantic.com/health/archive/2015/05/what-good-is-thinking-about-death/394151/

Curing the fear of death

How “tripping out” could change everything

A chemical called “psilocybin” shows remarkable therapeutic promise. Only problem? It comes from magic mushrooms

 


The second time I ate psychedelic mushrooms I was at a log cabin on a lake in northern Maine, and afterwards I sat in a grove of spruce trees for three and a half hours, saying over and over, “There’s so much to see!”

The mushrooms converted my worldview from an uninspired blur to childlike wonderment at everything I glimpsed. And now, according to recent news, certain cancer patients are having the same experience. The active ingredient in psychedelic mushrooms, psilocybin, is being administered on a trial basis to certain participating cancer patients to help them cope with their terminal diagnosis and enjoy the final months of their lives. The provisional results show remarkable success, with implications that may be much, much bigger.

As Michael Pollan notes in a recent New Yorker piece, this research is still in its early stages. Psychedelic mushrooms are presently classified as a Schedule I drug, meaning, from the perspective of our federal government, they have no medical use and are prohibited. But the scientific community is taking some steps that – over time, and after much deliberation – could eventually change that.

Here’s how it works: In a controlled setting, cancer patients receive psilocybin plus coaching to help them make the most of the experience. Then they trip, an experience that puts ordinary life, including their cancer, in a new perspective. And that changed outlook stays with them over time. This last part might seem surprising, but at my desk I keep a picture of the spot where I had my own transcendental experience several years ago; it reminds me that my daily tribulations are not all there is to existence, nor are they what actually matter.

The preliminary research findings are convincing. You could even call them awe-inspiring. In one experiment, an astounding two-thirds of participants said the trip was “among the top five most spiritually significant experiences of their lives.” Pollan describes one cancer patient in detail, a man whose psilocybin session was followed by months that were “the happiest in his life” — even though they were also his last. Said the man’s wife: “[After his trip] it was about being with people, enjoying his sandwich and the walk on the promenade. It was as if we lived a lifetime in a year.”



Which made me do a fist pump for science: Great work, folks. Keep this up! Researchers point out that these studies are small and there’s plenty they don’t know. They also stress the difference between taking psilocybin in a clinical setting — one that’s structured and facilitated by experts — and taking the drug recreationally. (By a lake in Maine, say.) Pollan suggests that the only commonality between the two is the molecules being ingested. My (admittedly anecdotal) experience suggests matters aren’t quite that clear-cut. But even that distinction misses a larger point, which is the potential for this research to help a great many people, with cancer or without, to access a deeper sense of joy in their lives. The awe I felt by that lake in Maine — and the satisfaction and peacefulness that Pollan’s cancer patient felt while eating his sandwich and walking on the promenade — is typically absent from regular life. But that doesn’t mean it has to be.

The growing popularity of mindfulness and meditation suggests that many of us would like to inject a bit more wonder into our lives. As well we should. Not to be a damp towel or anything, but we’re all going to die. “We’re all terminal,” as one researcher said to Pollan. While it’s possible that you’ll live to be 100 and hit every item on your bucket list, life is and always will be uncertain. On any given day, disaster could strike. You could go out for some vigorous exercise and suffer a fatal heart attack, like my dad did. There’s just no way to know.

In the meantime, most of us are caught in the drudgery of to-do lists and unread emails. Responsibility makes us focus on the practical side of things — the rent isn’t going to pay itself, after all — while the force of routine makes it seem like there isn’t anything dazzling to experience anyhow. Even if we’d like to call carpe diem our motto, what we actually do is more along the lines of the quotidian: Work, commute, eat, and nod off to sleep.

With that for a backdrop, it’s not surprising that many of us experience angst about our life’s purpose, not to mention a deep-seated dread over the unavoidable fact of our mortality. It can be a wrenching experience, one that sometimes results in panic attacks or depression. We seek out remedies to ease the discomfort: Some people meditate, others drink. If you seek formal treatment, though, you’ll find that the medical establishment doesn’t necessarily consider existential dread to be a disorder. That’s because it’s normal for us to question our existence and fear our demise. In the case of debilitating angst, though, a doctor is likely to recommend the regimen for generalized anxiety — some combo of therapy and meds.

Both of these can be essential in certain cases, of course; meds tend to facilitate acceptance of the way things are, while therapy can help us, over a long stretch of time, change the things that we can to some degree control. But psychedelics are different from either of these. They seem to open a door to a different way of experiencing life. Pollan quotes one source, a longtime advocate for the therapeutic use of psilocybin, who identifies the drug’s potential for “the betterment of well people.” Psychedelics may help ordinary people, who are wrestling with ordinary angst about death and the meaning of life, to really key into, and treasure, the various experiences of their finite existence.

In other words, psychedelics could possibly help us to be more like kids.

Small children often view the world around them with mystic wonder — pushing aside blades of grass to inspect a tiny bug that’s hidden underneath, or perhaps looking wide-eyed at a bright yellow flower poking through a crack in the sidewalk. (Nothing but a common dandelion, says the adult.) Maybe the best description of psilocybin’s effect is a reversion to that childlike awe at the complexity of the world around us, to the point that we can actually relish our lives.

What’s just as remarkable is that we’re not talking about a drug that needs to be administered on a daily or weekly or even monthly basis in order to be effective. These studies gave psilocybin to cancer patients a single time. Then, for months afterward, or longer, the patients reaped enormous benefit.

(The fact that psychedelics only need to be administered once could actually make it less likely that the research will receive ample funding, because pharmaceutical companies don’t see dollar signs in a drug that’s dispensed so sparingly. But that’s another matter.)

Of course, some skepticism may be warranted. Recreational use of psychedelics has been associated with psychotic episodes. That’s a good reason for caution. And a potential criticism here is that psilocybin is doing nothing more than playing a hoax on the brain — a hoax that conjures up a mystical experience and converts us into spellbound kids. You might reasonably ask, “Do I even want to wander around awe-struck at a dandelion the same way a 3-year-old might?”

So caution is reasonably advised. But what the research demonstrates is nonetheless remarkable: the way the experience seems to shake something loose in participants’ consciousness, something that lets them see beyond the dull gray of routine, or the grimness of cancer, to the joy in being with loved ones, the sensory pleasure of a good meal, or the astounding pink visuals of the sunset.