10 most haunted places around the world


In honor of Halloween, a look at the spookiest places on Earth — including Japan’s infamous suicide forest





This article originally appeared on GlobalPost.

It’s Halloween, and that means it’s time to get spooky.

Here are 10 of the most haunted places around the world to help you get your spook on.

1) The Beechworth Lunatic Asylum, Australia

(John T Collins/National Library of Australia/Wikimedia Commons)

Abandoned asylums, for my money, are the creepiest places ever, and Australia has a good one. The Beechworth Lunatic Asylum — originally called Mayday Hills Lunatic Asylum — is located in Victoria, Australia. It served as a mental hospital from 1867 until 1995. At its highest capacity, 1,200 patients lived there. About 9,000 patients died in its 130-year history, and there’s little doubt those souls are haunting it this very day. Visitors can take a nighttime ghost tour, to which I say, no thank you.

2) The Princess Theatre, Australia

(Wikimedia Commons)

Elsewhere in Australia, you’ll find the Princess Theatre, which is haunted by a ghost named Frederici. According to lore, Frederick Baker, or “Frederici,” was an Italian baritone singer who died on stage in 1888. He was finishing a performance as Mephistopheles in Faust when a trapdoor dropped beneath his feet and he fell beneath the stage, dying of a heart attack. For many years, the Princess Theatre saved an open seat for Frederici at every opening-night performance.

3) The Bhangarh Fort, India

(Himanshu Yogi/Wikimedia Commons)

If you’re not scared yet, head over to India, where you’ll find the Bhangarh Fort in the Alwar district of Rajasthan. The fort was built in 1573 and remains today a ruin of several temples, palaces, and smaller living units.

According to legend, the fort became cursed when a wizard who lived in the town fell in love with the princess of Bhangarh. Drawing on his skills in black magic, rather than on his interpersonal skills, he tried to woo the princess with a bowl of magic potion. It didn’t work. She figured out the play and threw the bowl against a large boulder. The boulder was disturbed enough to start rolling, and it rolled right into the path of the wizard. As the wizard faced down the boulder, he cursed the town, saying that it would be destroyed and become uninhabitable. He was crushed to death. Soon after, the town was invaded and pillaged. Most of its inhabitants, including the princess, were killed. Those who lived abandoned the fort.

The wizard’s curse remains, of course, and the ghosts of those killed continue to haunt the fort. The Archaeological Survey of India, which manages the site, forbids anyone from staying at the fort after dark.

4) Aokigahara, Japan

(Wikimedia Commons)

If you’ve ever visited this haunted place, you’re way braver than I am. At the base of Mt. Fuji, you’ll find Aokigahara, Japan’s globally infamous Suicide Forest. Hundreds of people have journeyed into the forest to kill themselves amidst its dense trees and vines, so many that the local police do annual sweeps to clear away the bodies. They no longer publicize the number of bodies discovered, out of fear that those numbers actually encourage suicides. In 2004, 108 people committed suicide there. Signs around the forest placed by local police plead with suicidal visitors to reconsider: “Your life is a precious gift to your parents” and “Please consult with the police before you decide to die.”

Understandably, many people believe that the forest is haunted by the souls of those who have died there. Others point to a different haunting origin, though. According to one legend, during times of famine in ancient Japan, families couldn’t feed themselves. Some would be abandoned in Aokigahara, where they died of starvation. Those ghosts haunt the forest today, of course.

It’s an all-around terrifying place.

5) Iulia Hasdeu Castle, Romania

(Constantin Barbu/Wikimedia Commons)

The Iulia Hasdeu Castle was built by Bogdan Petriceicu Hasdeu in Câmpina, Romania, after the death of his 19-year-old daughter, Iulia. Hasdeu dedicated the castle and the rest of his life to Iulia. He became a practitioner of spiritualism in an attempt to reconnect with her spirit, and designed one room in the castle solely for these daily spiritual exercises. Its walls are all black. Iulia reportedly haunts the castle still, walking through the courtyard in a white dress and holding daisies. Oh, and she still plays the piano each night.

6) Hell Fire Club on Montpelier Hill, Ireland

(Joe King/Wikimedia Commons)

The Hell Fire Club on Montpelier Hill was built as a hunting lodge in 1725 and reportedly became a gathering place for a small group of Dublin elites who met for debauchery and devil worship.


Tales of animal sacrifice, black masses, cloven-hoofed men, and murder surround the structure. It’s another popular destination for tourists and ghost tours.

7) Manila Film Center, Philippines

(Michael Francis McCarthy/Wikimedia Commons)

This one doesn’t look like your typical haunted castle, creepy forest, or old ruin, but its story is sad and terrifying. The Manila Film Center is reportedly haunted by the ghosts of workers killed during a tragic construction accident. At 3 a.m. on Nov. 17, 1981, scaffolding at the site collapsed, burying about 169 workers in quick-drying cement. No rescue teams were allowed at the site for nine hours. Reports differ on just how many workers were killed, but it’s possible that several bodies remain entombed in the structure.

8) Dragsholm Slot, Denmark

(Niels Elgaard Larsen/Wikimedia Commons)

Back to haunted castles: Dragsholm Slot, or Dragsholm Castle, in Denmark. The original castle was built in 1215. In the 16th and 17th centuries, parts of it were used to house prisoners of noble or ecclesiastical rank. It was rebuilt in a Baroque style after 1694, and is thought to house at least three ghosts: a grey lady, a white lady, and the ghost of one of its prisoners, James Hepburn, the 4th Earl of Bothwell.

9) Raynham Hall, United Kingdom

(Nigel Jones/Wikimedia Commons)

There are lots of haunted places in the United Kingdom. The most famous is the Tower of London, but that’s kind of played out, so here’s a slightly less famous haunted spot: Raynham Hall in Norfolk, which is haunted by the “Brown Lady,” so named because she appears wearing a brown brocade dress.

The Brown Lady is thought to be the ghost of Lady Dorothy Walpole (1686-1726). The sister of Robert Walpole (the first prime minister of Great Britain), she allegedly had an affair with a local lord, Lord Wharton. According to one story, her husband, Charles Townshend, discovered the affair and locked her in their home at Raynham Hall. Another story claims that it was Lord Wharton’s wife who somehow managed to arrange her entrapment. Either way, Dorothy was locked up. She died, and her soul was freed to haunt the hall.

The Brown Lady has been spotted many times, first in 1825, when guests at a Raynham Hall Christmas party retired to their rooms. The most recent sighting was Sept. 19, 1936, when a photographer for Country Life magazine snapped an iconic photo of her. It appeared in Country Life and then again in Life magazine. It was probably a smudge on a lens or a double-exposure. Or maybe not. Either way, the Brown Lady became famous.

10) Château de Châteaubriant, France

(Wikimedia Commons)

We’ll end with another story of a woman locked in a castle. This one comes from France. The Château de Châteaubriant was built in the 11th century. The haunting dates to the 16th century and the story of Jean de Laval and his wife, Françoise de Foix. King Francis I asked de Laval to assist him at court, and Françoise joined him there, becoming lady-in-waiting to the queen. She also became King Francis’ mistress. She died on Oct. 16, 1537, under mysterious circumstances. It was rumored that de Laval had learned of her affair and locked her in a room until he could poison her. Now, every year on Oct. 16, Françoise walks the halls of the Château.


My very un-American Halloween


Months after my dad died, I want to return to the holiday’s roots, in which loss is celebrated as a part of life


(Credit: Oliver Sved via Shutterstock)


I stood in the Atlantic Ocean near my father’s house at his memorial service, and helped to scatter his ashes into the ocean. I dug my hands into the urn, grabbing handfuls of the gritty ash, and threw it into the incoming waves, releasing my father into the place that he had loved best. I still remember the sensation of having my father’s ashes underneath my fingernails — the underside of my fingernails clogged with parts of my father’s body. How, in that moment, I considered how I could hold onto him, but then, aware of how unrealistic I was being, I plunged my hands into the ocean and felt the traces of him wash away.

My dad died on June 13 of pancreatitis. He was 71 years old. My father’s death was ugly — he didn’t slip into oblivion, but was, until the last hours of his life, in torturous pain that, I still feel, his doctors undermedicated. There was nothing redeeming about his end. I find myself embarrassed to admit how hard this process has been. On some days, I explore each facet of his end as if I were a jeweler examining a diamond, and on other days, I feel robbed of language, so unable to take in my dad’s death that I hide old photographs.

How odd, then, that Halloween is upon us. It’s a holiday when people all over the world remember and celebrate those who have passed, but how many Americans actually know that Nov. 1 and 2 are All Saints Day and All Souls Day? In typical U.S. fashion, Halloween bears little resemblance to what it once meant, just as the Easter Bunny has little to do with crucifixions. When it comes to our holidays, we’ll take cute and funny over dead serious any day.

Don’t get me wrong: I’ve always loved Halloween. In my household, sweets were an expression of love. How could I not adore a holiday in which strangers loaded my bag with milky chocolate, sweet lollipops, gum? My father forbade us to eat anything until he had inspected it, which often meant that he needed to “taste test” his favorite treats. How funny the joke was — that my dad was a big kid, too, and our chances of eating a Heath bar or a Peppermint Patty were nonexistent.

But I can’t help feeling that our inability to acknowledge death in our culture, to really acknowledge the loss that a single death brings to a family, is somehow symbolized by turning a once-sacred holiday into a romp. Halloween is either a kids’ celebration, where children dress in costumes and get drunk on candy, or an excuse to throw elaborate parties where grown-ups dress up and get drunk on alcohol, all in the name of scaring ourselves silly with the idea of ghosts and goblins, witches and vampires. Nowhere, it seems to me, is there an acknowledgment that Halloween is a night about death. Not scary Hollywood death, but the simple losses that families suffer, as mine did this year.

Four months after losing my dad, I have learned to put together good days where I accept that my dad is gone and that, even though I don’t believe in an afterlife, he is in a better place if only because he is not suffering.  But four months is not four months when you are in a state of mourning. Time oozes. It stretches.

Still, I feel pressure to be “over” it. To find “closure,” which, as best as I can tell, is supposed to happen between the death and the funeral. Grieving people are a nuisance to an industrial economy; in this country, we only give ourselves so many days to recover from any trauma. (As we come up on the one-year anniversary of Sandy Hook, how many people want to acknowledge how much grief and suffering still lingers in that town? How far from “closure” those families must be?) Better to stiffen our upper lips, put our shoulders to the wheel, and move along. I quote clichés because our language for getting on with our lives after a loved one’s passing is clichéd. We don’t have a language for profound everyday sadness. When we grieve, we grieve alone. When we say to others, “Hi. How are you?” we don’t want anyone to stray from the script and tell us the truth. I have learned the fine art of saying “I’m fine” and plastering a smile upon my face. I don’t want to cause anyone any discomfort. That would be rude. I have found myself toying with the idea of wearing a black armband. Something that says to the rest of the world, “I’m not my normal self.”

My next door neighbor is a retired anthropologist who studied the indigenous peoples of Peru. I spoke to her a few weeks ago, on one of the bad days. My neighbor reminded me that it’s the American way of death that is alienating and insanity-making, not that I was insane. She told me that in Peru, she accompanied families to the local graveyards, where they had tended to their dead. Sometimes, parts of bodies would work themselves to the surface of the soil. The people who were tending the graves would greet their ancestors, speak to them, rebury them, and treat it as a natural part of life. The dead were always with them: they were not shunted away. Our sanitized forms of body disposal create as much distance between the living and the dead as possible. I can’t imagine a pristine American cemetery littered with body parts.

I understand some people think this is morbid. That while it’s OK on Halloween to scare ourselves silly with thoughts of the evil dead, getting our hands dirty by handling a stray body part, or holding, as families do in places like Mexico, a full picnic on the grave of an ancestor in order to acknowledge that dead people are still a part of the family — that that represents something not proper. Maybe even primitive. We are too civilized to truck with the dead. But, of course, we are also afraid of death. We’ve built entire religions that guarantee us an afterlife because we’re so afraid of the nothingness that lies beyond our last breath. And we’ve built entire cultural genres on the idea that the dead hold nothing but maliciousness toward us, the living. Casper the Friendly Ghost is a children’s cartoon. As adults, we much prefer the horror that is currently playing at our local multiplex.

My biggest fear of death is not the dying itself but that once I am gone, my father and my grandparents will die, too. There will be no one to remember them. Their stories will go with me.

So this year, I have new plans for Halloween. I will still sit out on my front porch, bowl of candy by my side, as I distribute treats to the neighborhood kids, as it should be. I will marvel at their costumes, smile at their moms and dads, load their bags up with what I hope is the good stuff.

But then, for the next two days, I will take time out to publicly acknowledge the loss of my father. This doesn’t mean I will wear sackcloth and ashes — my father would have hated that — but it does mean that I will adopt some of the rituals from other cultures to help me deal with my grief.

There is no grave to sit by, no place to sit by my father’s body and tell him how much I miss him. If the Gulf Stream, pushed by the ocean’s tide, has caught his ashes, he should be halfway to England by now. But we have planted a blue flowering tree in the front yard in remembrance of him, a thousand miles away from where we scattered him. On Friday, I will cook one of my father’s favorite meals to serve to my family. And for my dad, I will leave a can of Boddington’s beer by the tree. I will burn a candle. And I will sit. Talking. Telling him all that he has missed since he died. Telling him how much I miss him, but how my life, and the lives of those who loved him, have progressed since he’s been gone. Rather than primitive, it seems the civilized thing to do.

Lorraine Berry is an associate editor at Talking Writing. Follow her on Twitter: @BerryFLW

Día de los muertos

Day of the Dead  


A Pre-Columbian Meso-American Observance of Death and Life

       El Día de los muertos, or Día de los difuntos (Day of the Deceased) or Día de los finados (Day of the Departed), is a celebration of both Life and Death. Historically, it has been a fusion of Judeo-Christian ideology and the indigenous-American vision of Death and the hereafter. The pre-Columbian cultures that honored and revered their departed loved ones, beckoning and summoning the return of those bygone spirits who had passed on, include the Aztecs, Maya-Quiché, Toltecs, Purépechas, Olmecas, Zapotecas, Tlaxcaltecas, and Mixtecas, all of México, the Yucatán Peninsula, and parts of Central América. When the Catholic Fathers or padres católicos arrived in the New World in the 16th Century in their madness for a golden El Dorado and erected their missions in an attempt to convert and evangelize the “heathen” Meso-American natives, the church eventually had to absorb these “Indian” traditions into its newly founded society. Whether the Spanish conquistador Hernán Cortés, who arrived in Nueva España in 1519, seemed to be the reincarnation of the Toltec god-king Quetzalcoatl (the “Plumed or Feathered Serpent,” literally “the twin of the quetzal”) or whether the fairer-skinned Spaniards’ arrival was independent of such a superstition, the Aztecs’ collective fear of the god’s fabled return became a three-dimensional reality in the form of the Conquest of México. Upon seeing a stray comet in the year 1517 A.D. and taking it as a bad omen and harbinger of disaster, the Aztec emperor Moctezuma promptly had his astrologers executed for not having predicted the spectacular event. This comet, thought to be the same one predicted by British astronomer Edmond Halley centuries later, predated the arrival of the Spaniards by only two years.


    In the indigenous, aboriginal perspective on Death, both life and death are mere aspects of a common duality or eternal cycle, as denoted in the following “Indian” poem from North America:


Do not stand at my grave and weep.
I am not there, I do not sleep.
I am a thousand winds that blow.
I am the diamond glints on the snow.
I am the sunlight on the ripened grain.
I am the gentle Autumn’s rain.

When you awaken in the morning hush,
I am the swift uplifting rush,
of quiet birds in circled flight.
I am the soft stars that shine at night.
Do not stand at my grave and cry:
I am not there, I did not die.

   So, both life and death are viewed much like heads and tails of a coin: they wouldn’t be complete without each other, nor would the coin. Thus, to celebrate life is to celebrate death as well, for all living things must be born, mature, oftentimes reproduce, and die, be they plant or animal. Life and its ultimate fate, Death, are both part of the eternal, rhythmic cycles of Nature and consequently are intertwined in inseparable co-existence. For Death is simply an encuentro or re-encounter, not a total disappearance. On the Day of the Dead or Día de los muertos, those relatives and family members who are left behind in the physical world of transient phenomena return to the graveyard to visit their defunct loved ones and friends, the muertos, who have traveled to the other side of Infinite Creation. So death is but our constant companion, an incessant shadow, a presence that pursues us daily, a reflection we cannot shake off, a step forth into another, higher sphere of alternate reality. Though the honoring of deceased ancestors who roamed the Earth before us is a festive occasion for those of Meso-American descent who observe the holiday, it is also a time of sadness, of mourning, of grieving, of quiet contemplation and remembrance, of making an ofrenda or offering, of creating an altar out of respect, or of praying for safe guidance for a departed or troubled soul into the realms beyond, into el mundo del más allá. The Day of the Dead shares a common autumnal “seasonal fusion” with North America’s Hallowe’en, yet the two observances are indeed very separate, unrelated holidays.

November 1st, in México and in keeping with the pre-Columbian perspective, is considered the Día de los angelitos (Day of the Little Angels) for those children who left the Earth at 12 years of age or younger. Since pre-puberty is synonymous with pure innocence, these departed souls, whose earthly existence was cut short, are denominated “little angels.” This day was declared All Saints Day, or Día de todos los santos, by the Christian churches. November 2nd is the Day of the Dead proper, for the spirits who passed on after the age of 12, through adulthood and into old age. This was proclaimed All Souls Day. All Saints Day became a day reserved on the church calendar to pay homage to all of the saints who otherwise do not have a feast day or fiesta set aside in their honor.

Now when the Náhuatl-speaking Aztecs arrived in the valle de Mexica (pronounced “may-she-ka”) around the 12th Century, where they built their Venice-styled, canaled metropolis of floating gardens, Tenochtitlán (ultimately present-day Mexico City), they found scattered throughout the valley zenpazúchitles, or marigold flowers, and somehow these Uto-Aztecan people came to associate these lovely decorations of Creation with death itself. Possibly, the strong aromatic scent of these flowers attracted those bygone souls who dwelt in the hereafter. Historians of the Americas are still unclear as to what symbolic connection these brightly yellow- and orange-colored flowers held for the Aztecs. Along with these marigold flores de la muerte, the Aztecs and many other pre-Columbian indigenous peoples burned copal, a tree resin and incense, to honor and conjure up the dead buried in the camposanto or holy ground.

In certain parts of México today, and especially in the southern states of Oaxaca and Quintana Roo, the Día de los muertos observance is often honored by making an altar in one’s home, replete with photos of the departed one(s), accompanied by the burning of candles and incense, displays of marigolds, máscaras or masks signifying an alternate persona, and much more. Visiting the gravesite is yet another time-honored custom and is indeed very popular in Texas, New Mexico, and other parts of the American Southwest. Pan de muerto (bread of the dead) and calaveras de azúcar (sugar skulls) are also a common, almost necessary part of the collective symbolism in the familial celebration of recall and visitation. The esqueletos (skeletons) always have smiling skull faces, denoting that those who have crossed over to the other side of Life are, ironically, happy to be free from the exigencies and demands of this earthly realm, free from the pain and suffering it encompasses: those left behind must endure a carnal physical existence, subject to the cumbersome laws of terrestrial gravity and spatial electromagnetism. It is widely accepted that the souls of the departed ones return to visit their living relatives. Hence, the living leave out food, candy, tequila, beer, cigarettes, and whatever other items the departed soul liked to consume when s/he was among the living. Since these visiting spirits are no longer of this world, they cannot remove these items left out on the altars or graves by physical means alone. Only the sensory impressions of these foods and drinks and their lingering delights can be assimilated into their world of the Beyond, as they consume the flavors through extrasensory, non-physical means only.

Día de los muertos is celebrated from el primero de noviembre (November 1st) through el dos de noviembre (November 2nd). Oftentimes, Mexican families camp out in cemeteries for up to three full days, as family members pray, bring food or goodies, sing, or burn incense. It is known that:

* much preparation goes into growing and gathering zenpazúchitles, the flowers of death;
* bread of the dead made in human shapes and sugar skulls made of fruit are popular;
* copal incense is burned on the altar through the midnight hours in séance-like fashion;
* the writing of sátiras, comical poems called calaveras (“skulls”), serves to lighten the grief;
* gravesites are cleared and cleaned annually by the surviving family members;
* mano de león or “lion’s paw” flowers are used to decorate the altares and tumbas;
* nubes or white carnations decorate the graves of departed children, signifying purity;
* tallow candles or velas are burned, one representing each departed loved one.

Come nightfall:

* families walk to the cemetery and visit the tombs of their loved ones in honor of them;
* they feast or dine and drink coffee at the gravesite;
* they leave the holy ground or camposanto to go home and throw a party or fiestecita;
* they attend mass given by a Catholic priest at daybreak;
* they return home to have sugar skulls for dessert and to tell fond stories of the deceased;
* fireworks displays often illuminate the path for the returning spirits of the angelitos.


   In our current states of Arizona, California, Colorado, New Mexico, and in the popular Tex-Mex tradition of Texas, artists, painters, and sculptors have honored the Día de los muertos in colorful array, especially with elaborately displayed altares muy decorados. Much Mexican-American or Chicano literature pays tribute to the traditional Holy Days and, more importantly, to the Aztec and indigenous visión de la muerte or viewpoint on Death. The literary work The Road to Tamazunchale, by Ron Arias, provides the reader with such a distinction in outlook, yet it is brimming with wit, charm, and humor. For in Anglo-American culture, death is somewhat taboo, a tragedy to someday contend with, a topic not to be discussed much, and Hallowe’en is a very light-hearted treatment of our ultimate kismet and final mortal resolution. Mexican children, in most parts, are raised with the idea of death constantly being around them, as a part of Life itself, and their skull-shaped candies, toy coffins, and skeletons reflect the carpe diem (seize the day) consciousness represented in much of Mexican and Central American society to this day. For if Death is but a pursuant perennial shadow about us, then we must savor each moment and enjoy every hour during our lifetimes, before our earthly existence slips away and we’re henceforth consumed by the eternal shadow of death.


    Many Aztec words have been imported into both the English and Spanish languages due to the Aztecs’ former geographical presence in Meso-America: xocolatl (“bitter water”) or chocolate (the royal drink of Aztec emperors like Cuitláhuac and Moctezuma), coyote (literally “singing dog”), adobe (dried mud brick), abarrote (retail grocery store), ayote (pumpkin), chicle (chewing gum), ceviche (fish marinated in lemon juice), cilantro (coriander), cacahuete (peanut), atole (a cornflour gruel mixed with rice, cinnamon, and milk), champurrado (a hot chocolate drink thickened with cornflour), pulque (a fermented agave juice), chicha (a strong drink made from fermented maize), jitomate (tomato), frijol (kidney bean), chilchote (a very hot chili pepper), chilatole (an ear of corn cooked with chili and pork meat), capirotada (bread pudding), cacalote (raven), enchilada (a rolled pancake of corn maize with chili, stuffed with cheese), rompope (an eggnog-like drink mixed with cream and alcohol), chayote (a pear-shaped, single-seeded fruit of the gourd family), camote (sweet potato), guajolote (turkey), piñata (a papier-mâché figurine hung at parties, filled with candies), sarape (a quilt), metate (grinding stone), huarache (sandal), molcajete (mortar), poncho (a blanket-like pullover worn over the shoulders), chirimoya (the cherimoya tree), chaparral (shrub), chimichanga (fried tortilla stuffed with meat), aguacate (avocado), chapultepec (grasshopper), tecolote (owl), zopilote (vulture), mitote (gossip), cuate (pal), mellizo (fraternal twin), tortilla (a cornmeal pancake), tamal (masa or corn maize rolled and stuffed with meat or fruit), papalote (kite), comal (a terra cotta bowl), popote (straw), pozole (a hot stew), chorizo (Mexican sausage), horchata (a rice-water drink made from earth nuts), chipotle (a smoked chili pepper), achiote (the arnotto tree), jalapeño (a hot, dark green pepper), maguey (the American agave plant used in making tequila), tomate (the fruit), elote (corn), mole (a spicy sauce made from cacao and chili peppers served with chicken), guacamole (a seasoned sauce of various condiments made from whipped avocado), chicalote (a prickly poppy flower), chili (a hot pepper), chilaquil (a maize omelet stuffed with cheese, herbs, and chili sauce), Chicano (Mexican-American), and even the Mexican seaside resort of Mazatlán (the place of the deer). Even the quetzal has become the national bird and monetary unit of Guatemala.



     The arrival of Hernán Cortés (1485-1547) and his troops in the land of the Aztecs in the early 16th Century brought military atrocities, harsh tributes, the torture of innocent men, women, and children, and epidemiological disaster to the native inhabitants of México. Tragically, when the Spanish conquistadores invaded and destroyed the Aztec capital of Tenochtitlán, beginning on Good Friday of 1519 and culminating in the summer of 1521 A.D., they began the systematic destruction and demolition of a highly advanced civilization that had already acquired accurate astronomical observations and mathematical equations on planetary motion, long before the time of Nicolaus Copernicus (1473-1543) and Galileo Galilei (1564-1642). Petroglyphs on archeoastronomy, data on both solar and lunar eclipses, scientific treatises on botany and biology, as well as beautiful floating gardens that rivaled those of ancient Babylonia, were all burned or destroyed in the conquerors’ ignorance, arrogance, religious bigotry, and greed. Among the many factors that contributed to the demise of these Uto-Aztecan peoples were:

(1) gunpowder and superior firearms never before encountered by the indigenous peoples
(2) the promising legendary myth of El Dorado and its relentless pursuit
(3) Hernán Cortés’s supposed physical (bearded) resemblance to the fabled Toltec god-king Quetzalcoatl
(4) the Spaniards’ importation of horses from Europe to the Américas as a means of transportation
(5) the Tlaxcaltecas, Zapotecas, and Olmecas having joined forces against Moctezuma
(6) pandemic plagues of smallpox and influenza brought to the Américas by European settlers, killing thousands

     Tragically, the beginning of the end for the indigenous populations of México, Central and South América set in during the 17th and 18th Centuries: rampant, uncontrolled plagues of smallpox, influenza, and pneumonia began to spread, devastating whole communities in their wake. The indígenas had no prior experience with such diseases. No cacique was familiar with the spread of germs. No brujos or bewitching healers, shamans, or curanderos could possibly offer any cure or cosmic solution for such contamination. Their gods, their cosmovisión or religious world view, provided little comfort either. As if by some magic of inscrutable Divine intervention, the newly arrived colonista settlers, with their built-in strong immunities, evaded the curse for the most part.


    In many parts of the state of Chihuahua and throughout Baja California and Northern México, Halloween or La noche de brujas is rapidly being assimilated into Hispanic society, as youngsters are beginning to disguise themselves and go trick-or-treating. Conversely, throughout much of the Spanish American Southwestern states, in the so-called “bilingual belt,” Halloween is starting to be merged with the pre-Columbian traditions of the Día de los muertos. And Anglo-Americans are beginning to honor the Día de los muertos in steadily increasing numbers, also. In mutual “border crossings,” pumpkins are beginning to adorn gravesites and ofrendas at the local panteón, as is amply displayed in San Fernando Cemetery in San Antonio, Texas. Papier-mâché witches and ghosts accompany skeletons and grinning calaveras on many a tomb. And Hallowe’en treats and candies such as Snickers and Mars Bars form part of a festive fanfare alongside panes de muertos, calaveras de azúcar, ataúdes de chocolate (chocolate coffins), and calacas, small handmade moveable clay-and-wire skeleton figurines that depict a vivacious afterlife.

Yet an overemphasis on the continuities of the pre-Columbian past can easily elide the striking similarities between the rituals and customs of the Day of the Dead and the early modern observance of All Souls' Day in Europe. In parts of Spain, seasonal sweets called panellets dels morts (little breads of the dead) are sold on All Saints' Day. A variety of other confections and cakes pay homage to the deceased in such places as the Azores, Portugal, Catalonia, Sardinia, and Haute-Saône, France. In both traditions, on either side of the Atlantic, death itself is, at least for a few days, confronted and colorfully revivified.

During this joyous, carnivalesque celebration, Mexicans embrace the concept of death (including that of departed friends and family) with true gusto, satirical humor, and affectionate mockery. As filmmaker Sergei Eisenstein acutely observed:

 “In México, the paths of life and death intersect in a visual way, as they do nowhere else; this meeting is inherent both in the tragic image of death trampling on life, and in the sumptuous image of life triumphing over death.”

    And as Nobel laureate Octavio Paz remarks in his celebrated work The Labyrinth of Solitude (El laberinto de la soledad), the Day of the Dead affirms “the nothingness and insignificance of human existence”, adding that many modern Mexicans joke about death as they “caress it, sleep with it, and celebrate it” while they “look at it, face to face, with impatience, disdain or irony.”

And all our yesterdays have lighted fools
The way to dusty death.

from William Shakespeare’s Macbeth

’Tis the undiscover’d country from whose bourn
No traveller returns
on the subject of death, from William Shakespeare’s Hamlet

“Solo el ser que no nace
No puede ser calavera.”
(“Only the being never born
can never become a calavera.”)

“Out, out, brief candle!
Life’s but a walking shadow, a poor player
That struts and frets his hour upon the stage
And then is heard no more. It is a tale
Told by an idiot, full of sound and fury,
Signifying nothing.”

from William Shakespeare’s Macbeth

“La muerte es uno de sus juegos predilectos
y es su amor más permanente.”
(“Death is one of his favorite games
and his most permanent love.”)

Octavio Paz (1914-1998)

“There is no death, only change.”
        Paramahansa Yogananda (1893-1952)



  The Aztecs flourished during the 15th century and dominated the central region of pre-Hispanic México. Their capital was Tenochtitlán. Their calendar is actually Olmec in origin, developed by that culture millennia earlier. But like all calendars that express the rotation of the earth around the sun on the basis of the terrestrial day (rather than the solar day), it is inexact. Over the centuries it was modified to make it ever more precise. The last correction was carried out at Huehuetlapallan (presumed to be Xochicalco), bringing the count to 365.25 days per year. The Aztecs were also deeply religious; they built temples to honor their many gods and performed human sacrifices in their honor.
The Olmecs are considered the first indigenous group of major importance in the region of the Valley of México. Their culture dates from around 1500 B.C. They built ceremonial centers and are best known for their colossal stone heads.
The Maya civilization flourished between A.D. 200 and 900. Theirs was a highly advanced culture. They built great temples and invented a complex system of hieroglyphic writing that recorded their life and history. They knew mathematics and astronomy, and they made sophisticated calculations to develop their calendar and to predict solar and lunar eclipses with precision.

We are but insects
Living out our short and pointless life;
Scurrying about to find the next pleasure
Failing to realize the nature of
Our fragile, short existence
And imminent Death

      for more on this holiday observance, visit: http://www.mexconnect.com/mex_/feature/daydeadindex.html 


for more on All Saints’ Day, visit: http://www.newadvent.org/cathen/01315a.htm

Drone victims give US lawmakers first-hand account of attack

October 29, 2013 5:00PM ET
Rafiq Rehman and his two children testify about the day a drone killed his mother;
five lawmakers attend

Nabila Rehman, left, 9, watches as her brother Zubair reads a statement about the day their grandmother was killed by a U.S. drone strike in Pakistan, at a hearing in Washington, Tuesday.

Jason Reed/Reuters

Nine-year-old Nabila Rehman rested her head on the table.

Nabila, a shy girl with startling hazel eyes and red streaks in her dark hair, her father Rafiq, and her 13-year-old brother Zubair have told the story of the day a drone fell from the sky on their village in North Waziristan so many times that by Tuesday morning the tale was rote, even if this particular retelling was before U.S. lawmakers, at a briefing that gave members of Congress their first opportunity to hear directly from Pakistani victims of American drone strikes.

It was Oct. 24, 2012, the day before the Islamic holy day of Eid al-Adha. Zubair, Nabila, their little sister, five-year-old Asma, and some of their cousins were in the fields beside their house as their grandmother, 67-year-old Momina Bibi, showed them how to tell when the okra was ripe for picking.

Zubair knew the drones were circling overhead; he has recognized their distinctive buzzing, a methodical zung, zung, zung, since he was even younger.

“It’s something that even a 2-year-old would know,” he said in Pashto, speaking to Al Jazeera through a translator. “We hear the noise 24 hours a day.”

Before the missile hit, he remembers hearing two clicks, like a trigger being pulled. Suddenly, day seemed to turn to night as they were enveloped in darkness and heat. Their grandmother was thrown 20 feet and killed instantly.

Zubair, Nabila and the other children wounded in the attack were taken to a hospital. Zubair had shrapnel lodged in his leg — an injury that would take expensive laser surgeries to heal — while Nabila looked down to see her hand bleeding.

“I tried to bandage my hand but the blood wouldn’t stop,” she said. “The blood kept coming.”

Momina Bibi’s wounds were so severe that neighbors would not allow her sons to see the body, said Rafiq, a primary schoolteacher in Pakistan who was in town buying school supplies and sweets when the attack happened.

In the days and weeks after, Rafiq said the newspapers reported that militants had been killed in the strike. As far as he knows, his mother was the sole fatality. He has never received an answer from the Pakistani or U.S. governments about why she was targeted or whether the strike was a mistake.

The Rehmans traveled halfway across the world, from their remote village of Tappi, to tell their story and to urge lawmakers to put an end to the covert CIA program of “targeted killings” in Pakistan, Yemen and elsewhere. They also participated in an Amnesty International report about casualties of drones and a documentary by filmmaker Robert Greenwald, called Unmanned. According to the London-based Bureau of Investigative Journalism, 376 total strikes have taken place in Pakistan, killing up to 926 civilians and as many as 200 children.


Since they arrived in Washington last weekend — their first time outside of Pakistan — the Rehmans have patiently sat for hours of interviews with dozens of media outlets in a dogged effort to change hearts and minds, with only a few breaks to go see the sights in the U.S. capital.

The Obama administration, for its part, until recently did not even acknowledge the existence of the program. Now, officials say drone warfare is a precise and effective means to neutralize enemies in remote regions of the world where capturing terrorists is difficult and that civilian casualties are minimal.

That rationale offers little solace to Rafiq and his family.

Critics of the program have pointed out that, beyond its legal and moral implications, the policy engenders hatred of America and breeds extremism.

But even after what his family has been through, Rafiq Rehman said he does not resent the United States. In fact, even after witnessing his first Halloween weekend in the States, he does not believe all that much separates him from Americans.

“It’s very peaceful here. For the most part, there’s a lot of freedom and people get along with each other. They’re nice, they respect each other, and I appreciate that,” Rafiq told Al Jazeera.

“We’re all human beings,” he said. “I knew that Americans would have a heart, that they would be sympathetic to me. That’s why I came here — I thought if they heard my story, they would want to listen to me and influence their politicians.”

Rafiq, like so many fathers, wants his children to have peaceful lives and the best education possible. He hopes Zubair grows up to be a doctor and that Nabila is a lawyer.

“(The drone attack) created a disruption in our lives,” he said. “Our children live in fear. They don’t want to go to school. They don’t want to play outside.”

Ultimately, only five members of Congress arrived at the briefing to hear their testimony Tuesday morning: Rep. Alan Grayson, D-Fla., who organized the briefing, along with Reps. Jan Schakowsky, D-Ill., Rush Holt, D-N.J., John Conyers, D-Mich., and Rick Nolan, D-Minn.

What compelling interest did the U.S. government have in murdering a grandmother of nine and a midwife who helped deliver babies in the village, Rehman asked them. How can he reassure his children that the drones will not come back?

“I no longer love blue skies,” Zubair said. “In fact, I now prefer gray skies. The drones do not fly when the skies are gray.”

Grayson said the briefing, held a full decade after the first drone strikes in Yemen by the Bush administration, was a promising start and dismissed the seemingly low attendance, noting that five members showed “a fair amount of interest.” Grayson doubted, however, that a full committee hearing with members of Congress would be called anytime soon.

“The appropriate committees generally are staffed by people, if I may say this, who are friends of the military industrial complex, not even enemies, or even skeptics of it,” he said.

Still, Zubair Rehman remained hopeful.

“I hope I can return home with a message,” he said. “I hope I can tell my community that Americans listened.”

Everything You Think You Know About Panhandlers Is Wrong

By Scott Keyes on October 30, 2013 at 11:15 am


A new survey of panhandlers in downtown San Francisco dispels a number of myths that society propagates about homeless people.

Conventional wisdom is that those on the sidewalk asking for a dollar are lazy freeloaders who will use the money for alcohol or drugs. Some even think that beggars are living large off of handouts, such as Fox News’ John Stossel, who has bravely used his television perch to take on beggars. “I had heard that some people beg for a living and make big bucks — $80,000 a year in some cases,” Stossel told Fox & Friends. “You really shouldn’t give to these street people,” Stossel concluded. “You are really supporting alcoholism and drug problems.”

Researchers wanted to test out whether this widely held view of panhandlers as lazy alcoholics getting rich off others was correct. The Union Square Business Improvement District, a collection of 500 property owners in downtown San Francisco, hired GLS Research to survey panhandlers over a two-day period in March.

They found that, for the vast majority of beggars, Stossel’s view was simply not true.

In San Francisco’s Union Square, the typical panhandler is a disabled middle-aged single male who is a racial minority and makes less than $25 per day despite panhandling seven days a week for more than five years. Though Stossel was insistent that panhandlers just use the money for beer and pot, the majority of those surveyed did not. In fact, 94 percent used the meager funds they raised for food.

In addition, some justify doing little to fight homelessness because, in their view, many homeless people don’t want help and prefer living on the streets. However, researchers discovered that, on the contrary, just 3 percent of panhandlers don’t want housing.

Among the survey’s findings:

  • 83 percent are men
  • 48 percent are African American
  • 31 percent are white
  • 69 percent are single
  • 26 percent served in the military
  • 70 percent are 40 to 59 years old
  • 58 percent have been panhandling for at least five years
  • 53 percent panhandle seven days a week
  • 60 percent make $25 a day or less
  • 94 percent use the money for food
  • 44 percent use it for drugs or alcohol
  • 62 percent are disabled
  • 25 percent are alcoholics
  • 32 percent are addicted to drugs
  • 82 percent are homeless

In total, 146 people participated in the survey.

Researchers also spoke with 400 people who had given money to panhandlers in the past year. They found that the largest group of givers were young working-class Bay Area residents. Empathy was a main driver; three in five said they gave “because they or a family member may be in need someday.”

Ben Kingsley on “Ender’s Game” tattoos: “I was conscious of their special power, their significance”

Wednesday, Oct 30, 2013 09:45 AM PDT

The Oscar-winner on playing a Maori military leader and why special effects don’t mean doom for actors

By Daniel D’Addario

Sir Ben Kingsley is a Brit of Indian descent, and he has one of the most diverse filmographies of any star out there. He’s played an Indian national hero in “Gandhi,” European Jews in “Schindler’s List” and “Anne Frank,” a South Asian holy man in “The Love Guru,” and an Iranian military man in “House of Sand and Fog.” But with his latest role, he’s transformed himself more than ever before.

The four-time Oscar nominee (he won the best actor trophy for “Gandhi”) plays Mazer Rackham, a mentor figure to the child hero of “Ender’s Game” (out Friday). Rackham is a genius military strategist in the battle against alien forces, and half-Maori; Kingsley dons extensive facial tattoos for the part. He explained the process of taking on another ethnicity on-screen to Salon — and he’s more concerned with the visceral process of acting his role as written than with details of why, for instance, Maori folks wear tattoos.

Rackham is just the latest twist in a career that has spanned Shakespearean productions, Marvel comic-book movies, and collaborations with Martin Scorsese; indeed, in Scorsese’s 3-D extravaganza “Hugo,” Kingsley played opposite “Ender’s Game” star Asa Butterfield. Unusually for a Commander of the Most Excellent Order of the British Empire (he received the honor in 2000), the actor has refused to limit himself to sure bets: he relishes the opportunity to work in front of a green screen. Kingsley explained just how an actor ought to perform opposite extensive special effects: “The tendency for me, and it’s seemed to work on-screen, is to underreact, rather than overreact.”

Hi, Sir Ben.

Hello! Where are you based?

I’m in New York.

Ah, I love that town! I just filmed something called “Learning to Drive” there; it was directed by Isabel Coixet, who directed me in “Elegy”! Good-o! Good, good! I finished it two weeks ago! It’s been a crazy wonderful year! I love filming in New York!

You played a person of Maori descent in “Ender’s Game.” Is there any particular responsibility that comes with donning tattoos associated with another culture, not your own?

Well, I know that Gavin Hood was somewhat apprehensive about asking me to wear Maori makeup. He thought I’d say I love this character, but I don’t want to wear tattoos. Well, he didn’t know me well. He did quite gently broach the subject, and he said he’d bring me together with Maori experts and show me videos of tattoos on Maori faces. I said, “Gavin! Hold it right there! I shall go into makeup, I shall apply tattoos, I shall wear them with pride onto your set.” It is that simple with me.

When I meet the costume department, who are designing the costumes, they bring a ton of sketches, swatches of designs. I see four racks of clothes, and I say, “Whatever you want me to wear in this scene, hang it in my trailer and I shall wear it!” For me, it’s the telling of the story. Costumes are their department! You hang it in my trailer, and I shall put it on. I’ve learned to do this because it frees me to do what I love, which is the acting. What happened in makeup, since you’re curious about the tattoos: the makeup artists would work very quietly for an hour and 10 minutes, an hour and 15, and I have my eyes closed. I gently run my lines, quietly meditate and go blank. When they say, “You’re ready,” I open my eyes, I see the extraordinary design they have created, and I leave my trailer.

I do know and have great respect for the tradition of Maori tattoos. They explain and display a lineage, a story, a past. When they applied those wonderful tattoos to my face, I was conscious of their special power, their significance. But I was more conscious of how every actor on set looked at me differently. I didn’t need to research it. I just needed to put it on my face, and everyone looked at me in a curious, slightly cautious way. Nobody just looked at me. They read me. And it shows on camera.

But beyond the tattoos, I’m curious about how diverse the characters you’ve played are ethnically, and how you manage to take on these roles without ever delving into caricature. Take, for instance, your role as an Iranian in “House of Sand and Fog,” for which you were nominated for an Oscar.

Well, I worked with Shohreh, who was a wonderful Iranian actress. And I got very acquainted with an Iranian family. By osmosis, I surrounded myself with the real thing rather than listen to a dialect coach. Fortunately, I was surrounded by them on the film, and worked by trust and a flow of energy and their generosity regarding their own wonderful culture.

You’ve appeared in a number of sci-fi or action films, among them “Iron Man 3,” “Thunderbirds,” “A Sound of Thunder,” and “Prince of Persia” — do you find this sort of role escapist?

I think that “Ender’s Game” and “Iron Man 3,” they gave me an opportunity to join in with a great team, understand my function in that team, and be useful. As soon as the director says action, there’s no discernible difference between my methodology in any of my films. It’s very, very exciting how actors’ body chemistry changes when the director says action!

Harrison Ford described it as a fighter pilot taking off in a jet. That’s the adrenaline actors feel when they walk onstage or hear “action” on a film set. I’ve heard it described as a 60-mile-an-hour car crash, but let’s go with the fighter pilot.

Some actors are worried that green-screen technology is making the role an actor plays on a film set progressively less important.

I think it’s changing. They’re starting to choose elements that perhaps come from different genres. More narrative-driven, character-driven. These actors are now brought into science-fiction and fantasy films in order to keep the story intact. “Ender’s Game” is character-driven. Yes, the effects are stunning. At the heart of it is a character-driven narrative. Every character has his place inside the narrative. I don’t find that at all diminishing or distorting of my craft. I was able to really enjoy my craft as an actor.

What do you think kids who know you from “Ender’s Game” or “Hugo” will think when they discover your more savage turns in “Sexy Beast” or “Bugsy”?

I hope you’re underestimating people’s grasp of what actors do for a living. I’ve done so many roles, and do so many, and will do so many. It’s very hard not to see I’m an actor. You have to really pick your films very carefully and stick your fingers in your ears. “He’s different every time!” Well, that’s the job.

What I’m more curious about is whether you prefer heroic or villainous roles, having played both. Which are you more inclined to seek out or take?

My first 15 years as an actor were spent largely under the amazing care and provocation of the Royal Shakespeare Company. Very often I’d be doing three or four plays in repertoire. The breadth of that man’s writing, the depth of it, gave me an appetite for musically written characters that have a definite, clear narrative and function. It’s knowing, and being empowered by knowing, why I am in the film that I find thrilling. What he is, I don’t care. As long as he’s in the dance.

But you don’t think green screens detract from that dance?

Gavin Hood, he’ll tell you himself, he did expend a huge amount of energy on the set describing to us exactly what we were seeing on the screen, and what the audience would see on the screen. Because he put so much energy into the exercise, we were allowed not to overact. The danger working with a green screen: One tends to concentrate on what is not there. We tend to manufacture reactions because there’s nothing to react to. Paradoxically, the green screen demands a smaller reaction. The tendency for me, and it’s seemed to work on-screen, is to underreact, rather than overreact. We allow the audience to react.

Would you do a sequel?

I don’t know, really. I don’t think of it like that. I’ve done him now, if I’m asked to do him again … I tend more to move on. I’ve done that now. Why do it nine more times?

Daniel D’Addario is a staff reporter for Salon’s entertainment section. Follow him on Twitter @DPD_

The Fantasy and Folklore of All Hallows



Halloween had its beginnings in an ancient, pre-Christian Celtic festival of the dead. The Celtic peoples, who were once found all over Europe, divided the year by four major holidays. According to their calendar, the year began on a day corresponding to November 1st on our present calendar. The date marked the beginning of winter. Since they were pastoral people, it was a time when cattle and sheep had to be moved to closer pastures and all livestock had to be secured for the winter months. Crops were harvested and stored. The date marked both an ending and a beginning in an eternal cycle.

The festival observed at this time was called Samhain (pronounced Sah-ween). It was the biggest and most significant holiday of the Celtic year. The Celts believed that at the time of Samhain, more so than any other time of the year, the ghosts of the dead were able to mingle with the living, because at Samhain the souls of those who had died during the year traveled into the otherworld. People gathered to sacrifice animals, fruits, and vegetables. They also lit bonfires in honor of the dead, to aid them on their journey, and to keep them away from the living. On that day all manner of beings were abroad: ghosts, fairies, and demons–all part of the dark and dread.

Samhain became the Halloween we are familiar with when Christian missionaries attempted to change the religious practices of the Celtic people. In the early centuries of the first millennium A.D., before missionaries such as St. Patrick and St. Columcille converted them to Christianity, the Celts practiced an elaborate religion through their priestly caste, the Druids, who were priests, poets, scientists and scholars all at once. As religious leaders, ritual specialists, and bearers of learning, the Druids were not unlike the very missionaries and monks who were to Christianize their people and brand them evil devil worshippers.

As a result of their efforts to wipe out “pagan” holidays, such as Samhain, the Christians succeeded in effecting major transformations in it. In 601 A.D. Pope Gregory the First issued a now famous edict to his missionaries concerning the native beliefs and customs of the peoples he hoped to convert. Rather than try to obliterate native peoples’ customs and beliefs, the pope instructed his missionaries to use them: if a group of people worshipped a tree, rather than cut it down, he advised them to consecrate it to Christ and allow its continued worship.

In terms of spreading Christianity, this was a brilliant concept and it became a basic approach used in Catholic missionary work. Church holy days were purposely set to coincide with native holy days. Christmas, for instance, was assigned the arbitrary date of December 25th because it corresponded with the mid-winter celebration of many peoples. Likewise, St. John’s Day was set on the summer solstice.

Samhain, with its emphasis on the supernatural, was decidedly pagan. While missionaries identified their holy days with those observed by the Celts, they branded the earlier religion’s supernatural deities as evil, and associated them with the devil. As representatives of the rival religion, Druids were considered evil worshippers of devilish or demonic gods and spirits. The Celtic underworld inevitably became identified with the Christian Hell.

The effects of this policy were to diminish but not totally eradicate the beliefs in the traditional gods. Celtic belief in supernatural creatures persisted, while the church made deliberate attempts to define them as being not merely dangerous, but malicious. Followers of the old religion went into hiding and were branded as witches.

The Christian feast of All Saints was assigned to November 1st. The day honored every Christian saint, especially those that did not otherwise have a special day devoted to them. This feast day was meant to substitute for Samhain, to draw the devotion of the Celtic peoples, and, finally, to replace it forever. That did not happen, but the traditional Celtic deities diminished in status, becoming fairies or leprechauns of more recent traditions.

The old beliefs associated with Samhain never died out entirely. The powerful symbolism of the traveling dead was too strong, and perhaps too basic to the human psyche, to be satisfied with the new, more abstract Catholic feast honoring saints. Recognizing that something that would subsume the original energy of Samhain was necessary, the church tried again to supplant it with a Christian feast day in the 9th century. This time it established November 2nd as All Souls Day–a day when the living prayed for the souls of all the dead. But, once again, the practice of retaining traditional customs while attempting to redefine them had a sustaining effect: the traditional beliefs and customs lived on, in new guises.

All Saints Day, otherwise known as All Hallows (hallowed means sanctified or holy), continued the ancient Celtic traditions. The evening prior to the day was the time of the most intense activity, both human and supernatural. People continued to celebrate All Hallows Eve as a time of the wandering dead, but the supernatural beings were now thought to be evil. The folk continued to propitiate those spirits (and their masked impersonators) by setting out gifts of food and drink. Subsequently, All Hallows Eve became Hallow Evening, which became Hallowe’en–an ancient Celtic, pre-Christian New Year’s Day in contemporary dress.

Many supernatural creatures became associated with All Hallows. In Ireland fairies were numbered among the legendary creatures who roamed on Halloween. An old folk ballad called “Allison Gross” tells the story of how the fairy queen saved a man from a witch’s spell on Halloween.

O Allison Gross, that lives in yon tower
the ugliest witch in the North Country…
She’s turned me into an ugly worm
and gard me toddle around a tree…

But as it fell out last Hallow even
When the seely [fairy] court was riding by,
the Queen lighted down on a gowany bank
Not far from the tree where I wont to lie…
She’s change me again to my own proper shape
And I no more toddle about the tree.

In old England cakes were made for the wandering souls, and people went “a’ soulin'” for these “soul cakes.” Halloween, a time of magic, also became a day of divination, with a host of magical beliefs: for instance, if persons hold a mirror on Halloween and walk backwards down the stairs to the basement, the face that appears in the mirror will be their next lover.

Virtually all present Halloween traditions can be traced to the ancient Celtic day of the dead. Halloween is a holiday of many mysterious customs, but each one has a history, or at least a story behind it. The wearing of costumes, for instance, and roaming from door to door demanding treats can be traced to the Celtic period and the first few centuries of the Christian era, when it was thought that the souls of the dead were out and about, along with fairies, witches, and demons. Offerings of food and drink were left out to placate them. As the centuries wore on, people began dressing like these dreadful creatures, performing antics in exchange for food and drink. This practice was called mumming, from which trick-or-treating evolved. To this day, witches, ghosts, and skeleton figures of the dead are among the favorite disguises. Halloween also retains some features that harken back to the original harvest holiday of Samhain, such as the customs of bobbing for apples and carving vegetables, as well as the fruits, nuts, and spiced cider associated with the day.

Today Halloween is becoming once again an adult holiday or masquerade, like Mardi Gras. Men and women in every disguise imaginable are taking to the streets of big American cities and parading past carved, candlelit jack-o’-lanterns, re-enacting customs with a lengthy pedigree. Their masked antics challenge, mock, tease, and appease the dread forces of the night, of the soul, and of the otherworld that becomes our world on this night of reversible possibilities, inverted roles, and transcendency. In so doing, they are reaffirming death and its place as a part of life in an exhilarating celebration of a holy and magic evening.

CryptoLocker Is The Nastiest Malware Ever & Here’s What You Can Do


Ransomware is an especially odious type of malware. The way it works is simple. Your computer will be infected with some malicious software. That software then renders your computer entirely unusable, sometimes purporting to be from local law enforcement and accusing you of committing a computer crime or viewing explicit pictures of children. It then demands monetary payment, either in the form of a ransom or a ‘fine’ before access to your computer is returned.

Horrible, isn’t it? Well, get ready to meet CryptoLocker; the evil patriarch of the Ransomware family.

What Is CryptoLocker?

CryptoLocker is a piece of malware targeting computers running the Microsoft Windows operating system. It is typically spread as an email attachment, often purporting to be from a legitimate source (including Intuit and Companies House). Some say it is also being spread through the ZeuS botnet.

Once installed on your computer, it systematically encrypts all documents that are stored on your local computer, as well as ones that are stored on mapped network drives and mounted removable storage.


The encryption used is strong: 2048-bit RSA, with the decryption key for your files stored on a remote server. The odds of your being able to break this encryption are practically nonexistent. If you want your files back, CryptoLocker asks you to fork over some cash: either two bitcoins (at the time of writing, worth almost USD $380) or $300 in MoneyPak or Ukash prepaid cards. If you don’t pay within three days, the decryption key is deleted and you lose access to your files forever.
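To see why recovery without the ransom is hopeless, here is a deliberately toy sketch of the hybrid scheme described above. The RSA primes are tiny textbook values, a hash-based XOR stream stands in for the real file cipher, and all function and variable names are my own invention, not CryptoLocker's actual code; the point is only that whoever holds the private exponent holds the files hostage.

```python
# Toy illustration of hybrid encryption as used by ransomware: files are
# encrypted with a symmetric key, and that key is in turn encrypted with
# an RSA public key whose private half never leaves the attacker's server.
# Textbook RSA with tiny primes here; real 2048-bit RSA is astronomically
# harder to break, which is the whole point.

import hashlib

# --- "attacker's server": generates the key pair and keeps d secret ---
p, q = 61, 53                 # toy primes (real RSA uses ~1024-bit primes)
n = p * q                     # public modulus
e = 17                        # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)           # private exponent (Python 3.8+ modular inverse)

def keystream_cipher(data: bytes, key: bytes) -> bytes:
    """Symmetric stand-in for AES: XOR with a hash-derived keystream."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(b ^ k for b, k in zip(data, stream))

# --- "victim's machine": only ever sees the public key (n, e) ---
file_contents = b"quarterly_report.xlsx contents..."
sym_key = b"\x2a"             # per-victim symmetric key (toy: 1 byte)
ciphertext = keystream_cipher(file_contents, sym_key)
locked_key = pow(int.from_bytes(sym_key, "big"), e, n)  # RSA-encrypt the key
del sym_key                   # the malware discards the plaintext key

# The victim now holds ciphertext + locked_key, but recovering the key
# means factoring n: trivial for n = 3233, infeasible at 2048 bits.

# --- only the attacker, holding d, can undo it ---
recovered_key = pow(locked_key, d, n).to_bytes(1, "big")
assert keystream_cipher(ciphertext, recovered_key) == file_contents
```

Since the XOR cipher is its own inverse, decryption is the same call as encryption; everything hinges on getting `sym_key` back, and only `d` can do that.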

I spoke to popular security expert and blogger Javvad Malik; this is what he had to say about CryptoLocker.

Ransomware such as CryptoLocker is nothing very new; variations of ransomware have been around for years. When you look at CryptoLocker, it predominantly comes in via phishing emails (from what I’ve seen). The best way to protect against it is for users to be vigilant about clicking on links within emails. Currently, it looks like there’s not much that can be done once you’re infected, and I wouldn’t advise anyone to pay the ransom. It goes back to having backups and data management in place.

Mitigating It

Reports suggest that some security programs have had a hard time preventing CryptoLocker from getting its claws into your system before it’s too late. Fortunately, American security expert Nick Shaw has created a handy, free piece of software called CryptoPrevent. It applies a number of settings to your installation of Windows that prevent CryptoLocker from ever executing, and it has been proven to work in Windows XP and Windows 7 environments.
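CryptoPrevent reportedly works by adding Windows Software Restriction Policy path rules that stop executables launching from user-writable folders such as %AppData%, the kind of location CryptoLocker runs from. As an illustrative sketch only (this is not CryptoPrevent’s code, and the specific patterns are assumptions), the rule logic amounts to matching a process path against a blocklist:

```python
# Illustrative sketch of path-rule matching, the kind of check a Software
# Restriction Policy performs at process launch. Not CryptoPrevent itself;
# the patterns below are assumptions modelled on blocking execution from
# user-writable folders.
import fnmatch

BLOCK_PATTERNS = [
    r"*\AppData\Roaming\*.exe",       # %AppData%
    r"*\AppData\Local\*.exe",         # %LocalAppData%
    r"*\AppData\Local\Temp\*.exe",    # user temp directory
]

def is_blocked(exe_path: str) -> bool:
    """Return True if exe_path matches any blocklisted launch location."""
    return any(fnmatch.fnmatchcase(exe_path, pat) for pat in BLOCK_PATTERNS)

print(is_blocked(r"C:\Users\bob\AppData\Roaming\evil.exe"))   # True
print(is_blocked(r"C:\Program Files\Vendor\app.exe"))         # False
```

Real restriction policies are stored in the registry and enforced by Windows itself; the point of automating them is that a double-clicked email attachment dropped into %AppData% simply refuses to run.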


Noam Chomsky: America’s infrastructure is broken

The linguist and activist on a bevy of topics ranging from the U.S.-Mexican border to the mortgage crisis

Noam Chomsky (Credit: Reuters/Jorge Dan)
This article in its present form originally appeared on AlterNet.

AlterNet In order to understand the rationale behind the fortification of the border and the physical form it has taken in recent years, it is necessary to go back a little first. The US-Mexican border, like most borders, was established by violence – and its architecture is the architecture of violence. The US basically invaded Mexico in a pretty brutal war back in the 1840s. The war was described by President-General Ulysses S. Grant as “the most wicked war in history”. [9] That may be an exaggeration, but it was a pretty wicked war. It was based on deeply racist ideas. First of all, it started with the annexation of Texas, which was called the re-annexation of Texas on the grounds that it was “really ours all along” […], that they stole it from us, and now we have to re-annex it. That took Texas away from Mexico. The rest of the war, and the later historical period, basically involved additional land grabs.

In order to understand it, you should read the progressive writers like Walt Whitman, Ralph Waldo Emerson, and others. The position was, as Whitman put it eloquently, that “backward Mexico had to be annexed as part of bringing civilization to the world”—which the US was seen as leading. [10] Emerson said it in more flowery language along the lines of, “it really doesn’t matter by what means Mexico is taken, as it contributes to the mission of ‘civilizing the world’ and, in the long run, it will be forgotten”. [11] Of course, that’s why we have names like San Francisco, San Diego, and Santa Fe all over the southwest and the west of the United States. We should really call it Occupied Mexico.

Like many borders around the world, it is artificially imposed and, like those many other borders imposed by external powers, it bears no relationship to the interests or the concerns of the people of the country—and it has a history of horrible conflict and strife. Take the border between Afghanistan and Pakistan, for example. The British imposed the borderline. They partitioned the overall area nearly in half and arbitrarily divided the land. No Afghan government has ever accepted it, and nor should they. This has happened all across Africa as well, of course, and so the Mexican border is no exception.

After the war of the 1840s the US-Mexican border remained fairly open. Basically the same people lived on the same sides of it, so people would cross to visit relatives or to engage in commerce, or something else. [12] It was pretty much an open border until the early 1990s. In 1994, the Clinton administration initiated the program of militarizing the border, and that was extended greatly under George W. Bush in the 2000s—largely under the guise of safety and defense from terrorism. [13] The two key pieces of legislation were called “The Border Protection, Anti-terrorism, and Illegal Immigration Control Act of 2005” and the “Secure Fence Act of 2006”. [14] That was interesting, and revealing, because the warnings from the security services were that the dangerous border, with regard to the possible incursion of terrorists into the US, was the Canadian border. If you take a look, you can see why. The Canadian border is so porous that you and I can cross it in some forested areas. If you were worried about terrorism, you would fortify the Canadian border. Instead, they fortified the Mexican border where there is no threat of terrorism; it was, clearly, for other reasons. [15]

Clinton’s militarization of the border in 1994 coincided with the passing—I should say the “imposition”—of the executive version of NAFTA, since it was not supported by the public. [16] In fact, the details of NAFTA weren’t even known by the public. [17] The labor movement, which is by law supposed to be consulted on trade-related issues, was barely notified until the last minute; and their recommendations were disregarded along with the recommendations of Congress’ own research bureau. The Office of Technology Assessment called for some form of free trade agreement, but one constructed quite differently from the final version of NAFTA.

It was clear that the final version of NAFTA, which is not a free trade agreement at all, would lead to the substantial destruction of small and medium-scale American-Mexican agriculture. [18] Mexican campesinos can be efficient, but they can’t possibly compete with highly subsidized US agricultural business. Mexican businesses were forced to compete on level terms with the US multinationals, which, in addition, had to be given what’s called National Treatment in Mexico. [19] The investment conditions were set up so that US firms would be able to invest in Mexico and exploit the cheap labor and the weak labor and environmental constraints there. It was also inevitably and deliberately meant to undermine smaller-scale American agricultural businesses and workers, which is exactly what happened.

In general, it was assumed that there would be a flow of people fleeing from Mexico across the border as either a direct, or indirect, result. It had to be militarized and protected. The defense infrastructure that crosses swathes of US land now, was not coincidental. It was tied up with all these issues. We don’t have internal documents from that period, so we can’t know for sure whether the militarization of the border was directly based on the expectation of an increase in economic refugees, but it seems a pretty plausible surmise.[20]

Incidentally, it’s not just to prevent Mexicans fleeing the ravages of US economic policy, but also refugees from other parts of South and Central America forced out of their countries by other policies. In early May this year, one of the dictators of Guatemala, Rios Montt, was given a heavy sentence for his role in the virtual genocide of indigenous Guatemalans living in the highlands—actions that were strongly supported by Ronald Reagan in the 1980s. Across the United States, generally, there are many people who fled the Guatemalan highlands as a result of the atrocities carried out in the early 1980s. [21] In fact, many live right where I do, near Boston.

Border crossings themselves are the acts of desperate people. You have to go miles through the desert with no water. It’s long treks in the heat during the day and freezing cold at night—and there are armed militias roaming around trying to hunt people down. I know personally a Guatemalan-Mayan woman who crossed the border half a dozen times while pregnant. Finally, she made it on the seventh try. I think she was seven or eight months pregnant and was rescued by solidarity workers who brought her to Boston. There are plenty of other cases like that—terrible cases. Families that are torn apart. Basically, these people don’t want to be here. They want to be back home, but conditions there have been made so awful that they can’t survive.  They are torn from their families, they can’t see their children; they can’t see their grandparents. They live and die apart. It’s a terrible situation. [22]

It’s interesting, however, that to some extent recently there has been a slight opening of the border in the San Diego-Tijuana area to allow for commercial and cultural contact. It does not break the border, but it does bend it a little. My own feeling is that what ought to happen, over most of the world—since these borders are in large measure unofficial and imposed by force—is that a process of border erosion should begin; attempts to allow for everyday cultural contact that could, in the longer term, lead to some form of integration. However, at the moment, the built forms you see in the US border states, that militarized architecture developed over years, seem likely to stay for a while. Certainly our understanding of it cannot be divorced from the social and political context surrounding it. It is clearly political architecture—maybe even a symbol [23]—built to send a message to both the Mexican and, importantly, the American public. [24]


Next section: Chomsky on how America’s economic model created the suburbs

Introduction: In drawing out this background to the physical infrastructure across the US-Mexican border, Chomsky expands on ideas hinted at in some of his most recent works—principally references found in Making the Future – Occupations, Interventions, Empire and Resistance, 2012 and Occupy, also from 2012. In discussing the question of US suburbanization in the second half of the Twentieth Century, he does something similar—develops isolated thoughts found elsewhere in his writings into more fully fleshed out arguments here. In Powers and Prospects for example, one finds the reference he develops in this interview to suburbia as a “social engineering project”. Similarly, his comments here on the ‘interventionist’ underbelly of successive, supposedly free-market, US governments, echo ideas explained in Understanding Power, Occupy, and a number of other texts.

However, in shifting attention from the clearly ‘oppressive’ architecture of a ‘separation barrier’ to the ‘desirable’ and much sought-after ‘suburban dream house’, his thought shifts significantly in register. The politics and issues that underlie this civil, and apparently market-led, architecture reveal, for Chomsky, a contradiction at the heart of US rhetoric on free trade. According to Chomsky, US governments have always wanted a very powerful state that intervenes massively in the economy. The key difference to the standard reading of the interventionist state, however, is that in the case of the US, it was intervention for the benefit of the wealthy. [25]

He argues that this interventionist model was, in fact, the one upon which the country was founded. He also suggests that “the U.S. pioneered that model of development” and furthermore, that Alexander Hamilton invented the concept of “infant industry protection and modern protectionism”.[26] Not only is that why, he argues, the US is a rich and powerful country today, it is the reason why the country’s residential infrastructure has developed in the way it has. It is what lies behind the suburban dream.

Chomsky: The social and physical construction of suburban America really was quite complex. It was a very elaborate system, and clearly a massive social engineering project that has changed US society enormously. [27]  Incidentally, I don’t have a personal objection to suburbs, in fact I live in one, but suburbanization is a different question. [28]  It starts back in the 1940s with a literal conspiracy. I mean a conspiracy that went to court. The conspirators got a minor pat on the wrist however.

They were General Motors, Standard Oil of California and, I think, Firestone Rubber. The origins of suburbia reveal an attempt to take over a fairly efficient mass-transportation system in parts of California—the electric railways in Los Angeles and the like—and destroy them so as to shift energy use to fossil fuels and increase consumer demand for rubber, automobiles and trucks and so on. [29] It was a literal conspiracy. It went to court. The courts fined the corporations $5000, or something like that, probably equivalent to the cost of their victory dinner.[30]

But what happened in California started a process that then expanded—and in many ways. It included the interstate highway system. That was presented as part of the defense against the Russians. It was launched under the Interstate Defense Highway Act of 1956, and was intended to facilitate the movement of people and goods, troops and arms, and, allegedly, to prevent overpopulation in specific areas that could become the focus of nuclear attack. [31] The slogan of defense is the standard way of inducing the taxpayer to pay the cost of the next stage of the hi-tech economy of course.[32] That’s true whether it be computers, the Internet or, as in this case, a car-based transportation system.[33]

From the late 1940s, into and through the 50s, there developed a complex interaction between federal government, state and local government, real-estate interests, commercial interests and court decisions, which had the effect of undermining the mass transit system across the country. It was pretty efficient in certain areas. If you go back a century, for example, it was possible to travel all around New England on electric railways. The first chapter of E. L. Doctorow’s Ragtime documents it. [34] Subsequently, we saw the elimination of the mass transport system in favor of fossil fuel use, automobiles, roads and airplanes, which are also an offshoot of federal government.

Today, we have private airline companies, but if you take a look at a Boeing plane next time you travel, you’ll see that you are basically taking a ride on a modified bomber. A lot of the technology, and the research that goes into the development of apparently independently funded and non-government projects in our economy, comes directly from, or has its origins in, federal government. The Reagan Administration, for example, was committed to an enormous increase in state investment through the ‘Pentagon system’—diverting public finance into hi-tech industries and a state-guaranteed market—largely through arms production.  It is essentially public subsidy for private profit—and they call it “free enterprise”. That can only be done by inciting fear in the minds of the public.[35]

The military has, to a large extent, always fulfilled this role of course. It has been used repeatedly as a site for technological innovation. The US is a perfect example.[36] If you revisit the roots of the aviation industry, it’s a clear case. You can read it in Fortune Magazine and other business journals of the time. It was understood in the 1940s that the airline industry—the private airline industry—could not have developed, and today cannot survive, without extensive federal government subsidy. It was stated perfectly openly, and was well understood. It’s the same today. The airports are government built—and so on and so on.

The whole infrastructure of air travel was, and is, part of government policy. It is not a natural development of a free economic system—at least not in the way that is claimed. The same is true of the roads of course. It is simply not true that suburbia is a product of the market, or market forces, or people’s ‘uninfluenced’ desires. It is the result of a deliberate social engineering program—led from the center. It is totally political in that sense. It’s often presented as a product of the market—and in that regard, it’s a standard argument that tries to draw upon the writings of Adam Smith to give it some sort of justification.

But this use of Smith to justify free market economics is just another distortion. Adam Smith would have hated the capitalism we see today, and he is explicit about it. He was not in favor of free, unbridled markets; today he would be called a libertarian socialist. [37] He understood this, and stated it clearly, in The Wealth of Nations, where he argues that England could be “saved” from a form of neoliberal globalization by an “invisible hand”. [38] There needs to be control—or intervention. Daniel Defoe argued something pretty similar in the eighteenth century.

Defoe identified that British industry wouldn’t be able to survive in the face of ‘genuine’ productive competition from China, India, and other Eastern countries. Britain had the highest real wages in the world and, at the time, the best organized working class—at least that’s what much recent research suggests. As Defoe argued, in that context, Britain would have been deindustrialized by the cheap costs of Indian production if protectionist policies hadn’t been employed. [39] From that, you can see how this use of Smith to ‘justify’ the market religion is actually false; and there are numerous other, more recent examples to underline that. [40]

Thomas Jefferson picked up many of the same themes. [41]  Like Smith, he saw the potential destruction the free market could bring. It was foreseeable. In the case we’re talking about here, the same is true. The devastating effects of exclusively profit focused thinking that the development of suburbia represents were foreseeable—and foreseen. Obviously, the interstate highway program and the destruction of public transport were prerequisites for it, but they served more than just limited interests of oil producers and car manufacturers, although they were central to it. It contributed, and was intended to contribute, to the artificial manufacture of other markets. These attempts to scatter the population into suburban areas across the country led to the emergence of shopping malls, for example. It also led to the breaking down of inner cities and so on. It was also accompanied by “white flight” of course. [42] Additionally, racial segregation was one of the other consequences, at least at first. [43]

That was all part of what we can quite literally call a massive social engineering project – of a very complex sort. [44] While there are some attractive elements to suburban living, as I said I live in a suburb myself by choice, it has left us with a society, and a physical infrastructure, that is unviable. Just take the Boston area where I live. It takes me forty-five minutes to an hour to drive to work because of traffic jams and detours and so forth. If there were a subway, it would take me ten minutes. But our system is designed so that you don’t have the choice of efficient, humanly beneficial transportation—and Boston is only one example. None of this is ‘natural’ in any way. It didn’t emerge spontaneously—a magical product of the market. It was engineered for a specific range of interests.

Next Section: Chomsky on Mortgage Crisis

Introduction: In contrast to the construction boom that pushed suburban sprawl to even greater extremes in the past two decades, the most recent ‘development’ to really mark the suburban landscape has been quite different—the subprime crisis. Leading to foreclosures on thousands of mortgages, and consequent repossessions and empty properties across the country, it represented the conversion of ‘the dream’ into a nightmare for many.  In exploring the context in which suburbia was once more promoted, and has momentarily declined, Chomsky identifies the culpability of a ‘corrupted’ and ‘blinded’ banking system. However, he is also asked to consider the interconnection of interests that link the Clinton and Bush administrations to the construction sector, and which facilitated the ‘turning of a blind eye’ to the artificial manufacture of demand in the years prior to 2008.

With particular regard to the fomenting of demand for houses at an artificially inflated price [45]—through unrealistically accessible mortgages—he is scathing of the banking and economic industries. However, his perspective goes deeper than the immediate actions of recent economists and financial executives. He argues that the logic and principles used to justify the liberalized operations of the market are, in themselves, myths. In returning to his interpretation of Adam Smith, he again suggests that they are principles based on a misunderstanding, or deliberate misinterpretation, of this historical doyen of the ‘free-marketeers’.

Chomsky: The subprime fraud can be seen as the latest stage of the processes we were discussing earlier. I can see that. It also involved an ever more complex and intricate set of interests—the banks, government, the building industry, and real-estate interests once again. Those interests have been at play since the mid-twentieth century with regard to the development and exploitation of the land, and the need to house people in the United States. It is true that it wasn’t solely the banking sector—but they are the prime criminals. [46] What they were doing verges on, and maybe crosses over into, literal criminal activity. [47]

The chicanery of mortgage selling should be seen as a crime I think. Tricking people into taking mortgages they can’t afford and so on, driving the prices very high—artificially high—why isn’t that considered a crime? Although the banks were the leaders in this, I suppose the economics profession in general deserves a good part of the blame here too. They simply refused to see the huge bubble that was developing. For about a hundred years house prices had pretty much tracked GDP – they sort of reflected the growth of the economy. Then, all of a sudden, they started shooting up. There was no economic basis for it.[48]

It should have been obvious. It was obvious. But the economics profession is caught up in a religion of market efficiency—ideas of rational expectation and so on. That ‘religion’ dictated that what was happening had to be right because the market was doing it. That pseudo-religious belief in the market meant that they simply didn’t see it. Here again, we come back to that distorted reading of Adam Smith. There were a few people who did see it all developing of course—Dean Baker, and a couple of others.[49] However, the profession predominantly, didn’t see it—or refused to see it, maybe. It seemed that they were enraptured by their form of religious fanaticism—but perhaps that is too sympathetic a reading of their motives.

The Federal Reserve Bank releases its transcripts after a five-year period, and the most recent ones released were those of 2007. They’re worth reading. Here are some of the most prestigious economists in the world, bankers and so on, discussing the economy. The economy was about to collapse around them. It was just at the point when the housing bubble was about to burst—when trillions of dollars of fake money was about to be lost with devastating effects for thousands of working families across the country. You read the transcripts, and they didn’t even see it. The grip of the religion was so strong that they couldn’t see what was in front of their eyes. They were programmed to see something else—the effectiveness of the market.

Primarily the responsibility is with the banks but there was federal government support, there was state government support, and a whole range of other interests were in play as well. You’re right in pointing out Clinton, and then again Bush. [50] Both administrations pushed the housing market and, inevitably, contributed to the explosion of urban sprawl that continued to spread across the country. But, if you look at the detail, it was principally a banking crisis. The banks were responsible for the most obvious and literal ‘criminal’ activity, as they were in Ireland and Spain and a number of other places. It verged on criminal behavior, undoubtedly. Incidentally, those responsible are bigger, richer, stronger than before—thanks to government bailouts—which was another scandal. [51]

The effects on the ground were clearly visible throughout that period—growing suburbs, growing sprawl and so on. From the 90s and later on, it was perfectly visible in terms of urban, suburban, and rural land developments, but it was also seen in prices. House prices were going through the roof—far higher than anything based on economic fundamentals would dictate—but there was that blindness, a kind of euphoria. It was evident in the economics profession, the media, politicians, and others. They were all hailing this as an enormous achievement. It was called “the great moderation” and Alan Greenspan, the Federal Reserve Chair, who was manipulating it all from the top, was hailed as one of the greatest economists of all time. [52] St. Alan he was called. For sure it was visible—but praised. [53]

You can see it on the ground where I live. My wife and I bought our house for $40,000 many years ago. Maybe today that would be $100,000, which is not exorbitant by US standards. It’s the only house on the street that has not either been torn down and replaced by a new, bigger building, or substantially expanded. When they were torn down during that recent period, what went up in their place was a mansion—a building that would sell for millions of dollars. There was rampant speculation. Homes became an investment, very obviously.

It all added more energy to segregation on the grounds of wealth. The poor are driven out of whole areas when this takes place.  All that was just as visible as new suburbs, towns, sprawl etc. Again, of course, as you indicated earlier, it’s an example of your field, architecture, operating as something integrated into a bigger complex of forces. In this case it’s property speculation and an economic system exploiting laws and people’s aspirations.

All of this was happening when this country faced a tremendous infrastructure collapse, which is still very serious. US infrastructure is in a terrible condition. It’s not just evident in our inner cities, where housing for the poor is still often in bad condition, but also on our roads, bridges and so on. Driving to work this morning I got caught up in detours around rebuilding that is, in some ways, essential. At least it is essential to the continuation of the current inefficient and failing transport model. It is necessary to reconsider the infrastructure of this country—the way it is set up and financed. It’s not really a question of architecture in the first instance; it is a question of politics and economics of course.


Noam Chomsky is Institute Professor (retired) at MIT. He is the author of many books and articles on international affairs and social-political issues, and a long-time participant in activist movements.

Naomi Klein: Why Science Is Telling All of Us to Revolt and Change Our Lives Before We Destroy the Planet



Climate scientists are coming to some incendiary conclusions.

People scream outside the United Nations’ Intergovernmental Panel on Climate Change in Stockholm to demand immediate political action on the climate, September 27, 2013. (Photo Credit: AFP)

In December 2012, a pink-haired complex systems researcher named Brad Werner made his way through the throng of 24,000 earth and space scientists at the Fall Meeting of the American Geophysical Union, held annually in San Francisco. This year’s conference had some big-name participants, from Ed Stone of Nasa’s Voyager project, explaining a new milestone on the path to interstellar space, to the film-maker James Cameron, discussing his adventures in deep-sea submersibles.

But it was Werner’s own session that was attracting much of the buzz. It was titled “Is Earth F**ked?” (full title: “Is Earth F**ked? Dynamical Futility of Global Environmental Management and Possibilities for Sustainability via Direct Action Activism”).

Standing at the front of the conference room, the geophysicist from the University of California, San Diego walked the crowd through the advanced computer model he was using to answer that question. He talked about system boundaries, perturbations, dissipation, attractors, bifurcations and a whole bunch of other stuff largely incomprehensible to those of us uninitiated in complex systems theory. But the bottom line was clear enough: global capitalism has made the depletion of resources so rapid, convenient and barrier-free that “earth-human systems” are becoming dangerously unstable in response. When pressed by a journalist for a clear answer on the “are we f**ked” question, Werner set the jargon aside and replied, “More or less.”

There was one dynamic in the model, however, that offered some hope. Werner termed it “resistance” – movements of “people or groups of people” who “adopt a certain set of dynamics that does not fit within the capitalist culture”. According to the abstract for his presentation, this includes “environmental direct action, resistance taken from outside the dominant culture, as in protests, blockades and sabotage by indigenous peoples, workers, anarchists and other activist groups”.

Serious scientific gatherings don’t usually feature calls for mass political resistance, much less direct action and sabotage. But then again, Werner wasn’t exactly calling for those things. He was merely observing that mass uprisings of people – along the lines of the abolition movement, the civil rights movement or Occupy Wall Street – represent the likeliest source of “friction” to slow down an economic machine that is careening out of control. We know that past social movements have “had tremendous influence on . . . how the dominant culture evolved”, he pointed out. So it stands to reason that, “if we’re thinking about the future of the earth, and the future of our coupling to the environment, we have to include resistance as part of that dynamics”. And that, Werner argued, is not a matter of opinion, but “really a geophysics problem”.

Plenty of scientists have been moved by their research findings to take action in the streets. Physicists, astronomers, medical doctors and biologists have been at the forefront of movements against nuclear weapons, nuclear power, war, chemical contamination and creationism. And in November 2012, Nature published a commentary by the financier and environmental philanthropist Jeremy Grantham urging scientists to join this tradition and “be arrested if necessary”, because climate change “is not only the crisis of your lives – it is also the crisis of our species’ existence”.

Some scientists need no convincing. The godfather of modern climate science, James Hansen, is a formidable activist, having been arrested some half-dozen times for resisting mountain-top removal coal mining and tar sands pipelines (he even left his job at Nasa this year in part to have more time for campaigning). Two years ago, when I was arrested outside the White House at a mass action against the Keystone XL tar sands pipeline, one of the 166 people in cuffs that day was a glaciologist named Jason Box, a world-renowned expert on Greenland’s melting ice sheet.

“I couldn’t maintain my self-respect if I didn’t go,” Box said at the time, adding that “just voting doesn’t seem to be enough in this case. I need to be a citizen also.”

This is laudable, but what Werner is doing with his modelling is different. He isn’t saying that his research drove him to take action to stop a particular policy; he is saying that his research shows that our entire economic paradigm is a threat to ecological stability. And indeed that challenging this economic paradigm – through mass-movement counter-pressure – is humanity’s best shot at avoiding catastrophe.

That’s heavy stuff. But he’s not alone. Werner is part of a small but increasingly influential group of scientists whose research into the destabilisation of natural systems – particularly the climate system – is leading them to similarly transformative, even revolutionary, conclusions. And for any closet revolutionary who has ever dreamed of overthrowing the present economic order in favour of one a little less likely to cause Italian pensioners to hang themselves in their homes, this work should be of particular interest. Because it makes the ditching of that cruel system in favour of something new (and perhaps, with lots of work, better) no longer a matter of mere ideological preference but rather one of species-wide existential necessity.

Leading the pack of these new scientific revolutionaries is one of Britain’s top climate experts, Kevin Anderson, the deputy director of the Tyndall Centre for Climate Change Research, which has quickly established itself as one of the UK’s premier climate research institutions. Addressing everyone from the Department for International Development to Manchester City Council, Anderson has spent more than a decade patiently translating the implications of the latest climate science to politicians, economists and campaigners. In clear and understandable language, he lays out a rigorous road map for emissions reduction, one that provides a decent shot at keeping global temperature rise below 2° Celsius, a target that most governments have determined would stave off catastrophe.

But in recent years Anderson’s papers and slide shows have become more alarming. Under titles such as “Climate Change: Going Beyond Dangerous . . . Brutal Numbers and Tenuous Hope”, he points out that the chances of staying within anything like safe temperature levels are diminishing fast.

With his colleague Alice Bows, a climate mitigation expert at the Tyndall Centre, Anderson points out that we have lost so much time to political stalling and weak climate policies – all while global consumption (and emissions) ballooned – that we are now facing cuts so drastic that they challenge the fundamental logic of prioritising GDP growth above all else.

Anderson and Bows inform us that the often-cited long-term mitigation target – an 80 per cent emissions cut below 1990 levels by 2050 – has been selected purely for reasons of political expediency and has “no scientific basis”. That’s because climate impacts come not just from what we emit today and tomorrow, but from the cumulative emissions that build up in the atmosphere over time. And they warn that by focusing on targets three and a half decades into the future – rather than on what we can do to cut carbon sharply and immediately – there is a serious risk that we will allow our emissions to continue to soar for years to come, thereby blowing through far too much of our 2° “carbon budget” and putting ourselves in an impossible position later in the century.

Which is why Anderson and Bows argue that, if the governments of developed countries are serious about hitting the agreed-upon international target of keeping warming below 2° Celsius, and if reductions are to respect any kind of equity principle (basically that the countries that have been spewing carbon for the better part of two centuries need to cut before the countries where more than a billion people still don’t have electricity), then the reductions need to be a lot deeper, and they need to come a lot sooner.

To have even a 50/50 chance of hitting the 2° target (which, they and many others warn, already involves facing an array of hugely damaging climate impacts), the industrialised countries need to start cutting their greenhouse-gas emissions by something like 10 per cent a year – and they need to start right now. But Anderson and Bows go further, pointing out that this target cannot be met with the array of modest carbon pricing or green-tech solutions usually advocated by big green groups. These measures will certainly help, but they are simply not enough: a 10 per cent drop in emissions, year after year, is virtually unprecedented since we started powering our economies with coal. In fact, cuts above 1 per cent per year “have historically been associated only with economic recession or upheaval”, as the economist Nicholas Stern put it in his 2006 report for the British government.

Even after the Soviet Union collapsed, reductions of this duration and depth did not happen (the former Soviet countries experienced average annual reductions of roughly 5 per cent over a period of ten years). They did not happen after Wall Street crashed in 2008 (wealthy countries experienced about a 7 per cent drop between 2008 and 2009, but their CO2 emissions rebounded with gusto in 2010, while emissions in China and India continued to rise throughout). Only in the immediate aftermath of the great market crash of 1929 did the United States, for instance, see emissions drop for several consecutive years by more than 10 per cent annually, according to historical data from the Carbon Dioxide Information Analysis Center. But that was the worst economic crisis of modern times.

If we are to avoid that kind of carnage while meeting our science-based emissions targets, carbon reduction must be managed carefully through what Anderson and Bows describe as “radical and immediate de-growth strategies in the US, EU and other wealthy nations”. Which is fine, except that we happen to have an economic system that fetishises GDP growth above all else, regardless of the human or ecological consequences, and in which the neoliberal political class has utterly abdicated its responsibility to manage anything (since the market is the invisible genius to which everything must be entrusted).

So what Anderson and Bows are really saying is that there is still time to avoid catastrophic warming, but not within the rules of capitalism as they are currently constructed. Which may be the best argument we have ever had for changing those rules.

In a 2012 essay that appeared in the influential scientific journal Nature Climate Change, Anderson and Bows laid down something of a gauntlet, accusing many of their fellow scientists of failing to come clean about the kind of changes that climate change demands of humanity. On this it is worth quoting the pair at length:

. . . in developing emission scenarios scientists repeatedly and severely underplay the implications of their analyses. When it comes to avoiding a 2°C rise, “impossible” is translated into “difficult but doable”, whereas “urgent and radical” emerge as “challenging” – all to appease the god of economics (or, more precisely, finance). For example, to avoid exceeding the maximum rate of emission reduction dictated by economists, “impossibly” early peaks in emissions are assumed, together with naive notions about “big” engineering and the deployment rates of low-carbon infrastructure. More disturbingly, as emissions budgets dwindle, so geoengineering is increasingly proposed to ensure that the diktat of economists remains unquestioned.

In other words, in order to appear reasonable within neoliberal economic circles, scientists have been dramatically soft-pedalling the implications of their research. By August 2013, Anderson was willing to be even more blunt, writing that the boat had sailed on gradual change. “Perhaps at the time of the 1992 Earth Summit, or even at the turn of the millennium, 2°C levels of mitigation could have been achieved through significant evolutionary changes within the political and economic hegemony. But climate change is a cumulative issue! Now, in 2013, we in high-emitting (post-)industrial nations face a very different prospect. Our ongoing and collective carbon profligacy has squandered any opportunity for the ‘evolutionary change’ afforded by our earlier (and larger) 2°C carbon budget. Today, after two decades of bluff and lies, the remaining 2°C budget demands revolutionary change to the political and economic hegemony” (his emphasis).

We probably shouldn’t be surprised that some climate scientists are a little spooked by the radical implications of even their own research. Most of them were just quietly doing their work measuring ice cores, running global climate models and studying ocean acidification, only to discover, as the Australian climate expert and author Clive Hamilton puts it, that they “were unwittingly destabilising the political and social order”.

But there are many people who are well aware of the revolutionary nature of climate science. It’s why some of the governments that decided to chuck their climate commitments in favour of digging up more carbon have had to find ever more thuggish ways to silence and intimidate their nations’ scientists. In Britain, this strategy is becoming more overt, with Ian Boyd, the chief scientific adviser at the Department for Environment, Food and Rural Affairs, writing recently that scientists should avoid “suggesting that policies are either right or wrong” and should express their views “by working with embedded advisers (such as myself), and by being the voice of reason, rather than dissent, in the public arena”.

If you want to know where this leads, check out what’s happening in Canada, where I live. The Conservative government of Stephen Harper has done such an effective job of gagging scientists and shutting down critical research projects that, in July 2012, a couple of thousand scientists and supporters held a mock funeral on Parliament Hill in Ottawa, mourning “the death of evidence”. Their placards said, “No Science, No Evidence, No Truth”.

But the truth is getting out anyway. The fact that the business-as-usual pursuit of profits and growth is destabilising life on earth is no longer something we need to read about in scientific journals. The early signs are unfolding before our eyes. And increasing numbers of us are responding accordingly: blockading fracking activity in Balcombe; interfering with Arctic drilling preparations in Russian waters (at tremendous personal cost); taking tar sands operators to court for violating indigenous sovereignty; and countless other acts of resistance large and small. In Brad Werner’s computer model, this is the “friction” needed to slow down the forces of destabilisation; the great climate campaigner Bill McKibben calls it the “antibodies” rising up to fight the planet’s “spiking fever”.

It’s not a revolution, but it’s a start. And it might just buy us enough time to figure out a way to live on this planet that is distinctly less f**ked.


Naomi Klein, the author of “The Shock Doctrine” and “No Logo”, is working on a book and a film about the revolutionary power of climate change. You can follow her on Twitter @naomiaklein
