How “Archie” went from dull to daring

The world’s tamest comic series is now our most groundbreaking

Archie used to be the safest, squarest comic book franchise out there. But in the past few years, something changed

How "Archie" went from dull to daring: The world's tamest comic series is now our most groundbreaking

Like a lot of people, I used to get those little “Archie” digests at the supermarket when I was a lad. I remember enjoying them, but they didn’t have a big impact on me. Archie, Betty, Veronica, Jughead and the rest of the gang are clearly part of the collective unconscious, but they’ve never felt like essential reading. When I drifted away from comics for a while, books like “Maus” and “Watchmen” and “Daredevil: Born Again” stayed with me, but my Archies were the first to go. They felt disposable because the characters never changed. Nobody played it safer than Archie Comics.

Those days are a distant memory. Archie Comics is now known for taking wild chances and daring approaches that put Marvel and DC to shame. The debut of “Chilling Adventures of Sabrina” and the announcement of the batshit crossover “Archie Meets Predator” highlight what’s been apparent for years now: The company formerly known for the squarest and most unchanging characters in comics has become one of the most adventurous and exciting publishers. From the zombie apocalypse to a forthcoming story by Lena Dunham, today’s Archie Comics are anything but disposable or predictable. Improbably, anything goes in Riverdale.

A brand new series—“Chilling Adventures of Sabrina”—is the latest evidence of Archie’s willingness to take a radical new direction with an old character. Sabrina has been popular for decades, perhaps even more popular than Archie Andrews himself, thanks to the successful Sabrina series that aired on ABC and the WB. But Sabrina’s adventures, like Archie’s, have usually been fairly innocent teenage fare. Not anymore. Sabrina and her world take a more serious and historical turn in the new series, written by Archie’s Chief Creative Officer Roberto Aguirre-Sacasa and illustrated by Robert Hack, who together bring surprising gravitas and depth to the concept of a teenage witch, giving Sabrina a backstory full of tragedy while keeping the teen shenanigans. Hack’s art reminds me a little of Tula Lotay’s surreal, boundary-smashing work on “Supreme Blue Rose.” There’s a dreamy feel—or maybe I should say a nightmarish feel—that also fits the series’ specific historical setting, starting in 1951 with Sabrina’s birth. The best trick in this magic-filled book is that even in an older setting with darker horror, Sabrina is still Sabrina.



That trick was perfected in the ongoing series that inspired this new spin on Sabrina: “Afterlife with Archie,” which makes Riverdale’s zombie contagion feel appropriately deadly while maintaining the essence of the characters. “Afterlife” is a genuinely moving comic and a helluva accomplishment, thanks in no small part to the moody, evocative art of Francesco Francavilla, whose visuals create a recognizable yet new Riverdale that’s about as safe as the prison from “The Walking Dead.” The critical and commercial success of “Afterlife” led to “Sabrina,” much as the title of “Afterlife” plays on “Life with Archie”—a series that followed two parallel versions of Archie: one who married Betty and one who married Veronica. Both universes culminated in the death of Archie earlier this year. Marriage, death, zombies, alternate universes: Archie has embraced the biggest possibilities of both real life and comic books.

I asked Archie co-CEO/publisher Jon Goldwater about the company’s innovations, and he said they have a “story first” philosophy, but “don’t want to feel limited or tied down by what’s come before or what anyone else is doing.” He says the current era began about six years ago when he told editors and creators that “everything was on the table. No idea was too crazy and nothing was too precious.” From that meeting came Kevin Keller, who Goldwater believes is “the most important new character at Archie since the original five of Archie, Betty, Veronica, Jughead and Reggie.” Keller, like older members of the gang, also appears in multiple versions and universes: he’s already been a superhero and senator.

The company’s confidence is reflected in the recent announcement of “Archie Meets Predator,” which might be the weirdest team-up or mash-up by any publisher. But there is a precedent for this series (a collaboration with Dark Horse Comics) at Archie. As Chris Sims discussed in Comics Alliance, the company’s new creativity isn’t entirely unprecedented: there have been some crazy Archie stories over the years. The best is probably “Archie Meets the Punisher,” a combination of the most unlikely genres imaginable: teen soap opera and vigilante pulp. That’s like “Gilmore Girls” and “Dexter” having a crossover. There have also been “Archie Meets KISS” and “Archie Meets Glee,” so this is another area of the Archie-verse that’s open-ended to say the least.

Archie Comics is taking chances with less-familiar characters too. The company also owns a stable of little-known superheroes, which it is rebooting under a new line called Dark Circle that will include the Fox, the Black Hood and the Shield. The Shield is an especially noteworthy character for reasons old and new. Created in 1940, the Shield is an all-American hero in the vein of Captain America—one who actually preceded Cap by a year. In fact, the creators of Captain America changed the original shape of Cap’s shield to avoid confusion with the Shield. In the new Shield series, a woman will take up the mantle of this underappreciated hero. Like the female Thor and books like “Rat Queens,” “Harley Quinn” and “She-Hulk,” the new Shield is an example of the growing female presence in comic book characters and fandom.

Archie is learning something DC is figuring out with series like “Batman ’66,” a continuation of the Adam West Batman: readers are cool with multiple, inconsistent, far-out versions of beloved characters. And when you free a character from the prison of continuity—the tangled web of what really happened and supposedly counts in the main version of a character—you’re free to tell better stories with greater consequences. When you let stories stand on their own, you can marry Archie, or kill him, or make him fight Predator. Now that the elasticity of the Archie crew has been embraced, it’s hard to imagine any genre or team-up that’s not fair game. Sci-fi Archie? Archie vs. Archer? Who knows?

Anything seems possible in Riverdale. Young me would have been shocked to read that sentence. Who would have guessed wholesome, simple, predictable Archie Andrews would end up the poster boy for the bizarre, complex, freewheeling possibilities of comics?

Mark Peters is a freelance writer from Chicago. He writes jokes on Twitter and is a columnist for Visual Thesaurus and McSweeney’s. He is also Comic Book Fella on Tumblr.

http://www.salon.com/2014/10/25/how_archie_went_from_dull_to_daring_the_worlds_tamest_comic_series_is_now_our_most_groundbreaking/?source=newsletter

Friedrich Nietzsche on Why a Fulfilling Life Requires Embracing Rather than Running from Difficulty

A century and a half before our modern fetishism of failure, a seminal philosophical case for its value.

German philosopher, poet, composer, and writer Friedrich Nietzsche (October 15, 1844–August 25, 1900) is among humanity’s most enduring, influential, and oft-cited minds — and he seemed remarkably confident that he would end up that way. Nietzsche famously called the populace of philosophers “cabbage-heads,” lamenting: “It is my fate to have to be the first decent human being. I have a terrible fear that I shall one day be pronounced holy.” In one letter, he considered the prospect of posterity enjoying his work: “It seems to me that to take a book of mine into his hands is one of the rarest distinctions that anyone can confer upon himself. I even assume that he removes his shoes when he does so — not to speak of boots.”

A century and a half later, Nietzsche’s healthy ego has proven largely right — for a surprising and surprisingly modern reason: the assurance he offers that life’s greatest rewards spring from our brush with adversity. More than a century before our present celebration of “the gift of failure” and our fetishism of failure as a conduit to fearlessness, Nietzsche extolled these values with equal parts pomp and perspicuity.

In one particularly emblematic specimen from his many aphorisms, penned in 1887 and published in the posthumous selection from his notebooks, The Will to Power (public library), Nietzsche writes under the heading “Types of my disciples”:

To those human beings who are of any concern to me I wish suffering, desolation, sickness, ill-treatment, indignities — I wish that they should not remain unfamiliar with profound self-contempt, the torture of self-mistrust, the wretchedness of the vanquished: I have no pity for them, because I wish them the only thing that can prove today whether one is worth anything or not — that one endures.

(Half a century later, Willa Cather echoed this sentiment poignantly in a troubled letter to her brother: “The test of one’s decency is how much of a fight one can put up after one has stopped caring.”)

With his signature blend of wit and wisdom, Alain de Botton — who contemplates such subjects as the psychological functions of art and what literature does for the soul — writes in the altogether wonderful The Consolations of Philosophy (public library):

Alone among the cabbage-heads, Nietzsche had realized that difficulties of every sort were to be welcomed by those seeking fulfillment.

Not only that, but Nietzsche also believed that hardship and joy operated in a kind of osmotic relationship — diminishing one would diminish the other — or, as Anaïs Nin memorably put it, “great art was born of great terrors, great loneliness, great inhibitions, instabilities, and it always balances them.” In The Gay Science (public library), the book in which his famous “God is dead” proclamation first appeared, he wrote:

What if pleasure and displeasure were so tied together that whoever wanted to have as much as possible of one must also have as much as possible of the other — that whoever wanted to learn to “jubilate up to the heavens” would also have to be prepared for “depression unto death”?

You have the choice: either as little displeasure as possible, painlessness in brief … or as much displeasure as possible as the price for the growth of an abundance of subtle pleasures and joys that have rarely been relished yet? If you decide for the former and desire to diminish and lower the level of human pain, you also have to diminish and lower the level of their capacity for joy.

He was convinced that the most notable human lives reflected this osmosis:

Examine the lives of the best and most fruitful people and peoples and ask yourselves whether a tree that is supposed to grow to a proud height can dispense with bad weather and storms; whether misfortune and external resistance, some kinds of hatred, jealousy, stubbornness, mistrust, hardness, avarice, and violence do not belong among the favorable conditions without which any great growth even of virtue is scarcely possible.

De Botton distills Nietzsche’s convictions and their enduring legacy:

The most fulfilling human projects appeared inseparable from a degree of torment, the sources of our greatest joys lying awkwardly close to those of our greatest pains…

Why? Because no one is able to produce a great work of art without experience, nor achieve a worldly position immediately, nor be a great lover at the first attempt; and in the interval between initial failure and subsequent success, in the gap between who we wish one day to be and who we are at present, must come pain, anxiety, envy and humiliation. We suffer because we cannot spontaneously master the ingredients of fulfillment.

Nietzsche was striving to correct the belief that fulfillment must come easily or not at all, a belief ruinous in its effects, for it leads us to withdraw prematurely from challenges that might have been overcome if only we had been prepared for the savagery legitimately demanded by almost everything valuable.

(Or, as F. Scott Fitzgerald put it in his atrociously, delightfully ungrammatical proclamation, “Nothing any good isn’t hard.”)

Nietzsche arrived at these ideas in a roundabout way. As a young man, he was heavily influenced by Schopenhauer. At the age of twenty-one, he chanced upon Schopenhauer’s masterwork The World as Will and Representation and later recounted this seminal turn in his life:

I took it in my hand as something totally unfamiliar and turned the pages. I do not know which demon was whispering to me: ‘Take this book home.’ In any case, it happened, which was contrary to my custom of otherwise never rushing into buying a book. Back at the house I threw myself into the corner of a sofa with my new treasure, and began to let that dynamic, dismal genius work on me. Each line cried out with renunciation, negation, resignation. I was looking into a mirror that reflected the world, life and my own mind with hideous magnificence.

And isn’t that what the greatest books do for us, why we read and write at all? But Nietzsche eventually came to disagree with Schopenhauer’s defeatism and slowly blossomed into his own ideas on the value of difficulty. In an 1876 letter to Cosima Wagner — the second wife of the famed composer Richard Wagner, whom Nietzsche had befriended — he professed, more than a decade after encountering Schopenhauer:

Would you be amazed if I confess something that has gradually come about, but which has more or less suddenly entered my consciousness: a disagreement with Schopenhauer’s teaching? On virtually all general propositions I am not on his side.

This turning point is how Nietzsche arrived at the conviction that hardship is the springboard for happiness and fulfillment. De Botton captures this beautifully:

Because fulfillment is an illusion, the wise must devote themselves to avoiding pain rather than seeking pleasure, living quietly, as Schopenhauer counseled, ‘in a small fireproof room’ — advice that now struck Nietzsche as both timid and untrue, a perverse attempt to dwell, as he was to put it pejoratively several years later, ‘hidden in forests like shy deer.’ Fulfillment was to be reached not by avoiding pain, but by recognizing its role as a natural, inevitable step on the way to reaching anything good.

And this, perhaps, is the reason why nihilism in general, and Nietzsche in particular, has had a recent resurgence in pop culture — the subject of a fantastic recent Radiolab episode. The wise and wonderful Jad Abumrad elegantly captures the allure of such teachings:

All this pop-nihilism around us is not about tearing down power structures or embracing nothingness — it’s just, “Look at me! Look how brave I am!”

Quoting Nietzsche, in other words, is a way for us to signal others that we’re unafraid, that difficulty won’t break us, that adversity will only assure us.

And perhaps there is nothing wrong with that. After all, Viktor Frankl was the opposite of a nihilist, and yet we flock to him for the same reason — to be assured, to be consoled, to feel like we can endure.

The Will to Power remains indispensable and The Consolations of Philosophy is excellent in its totality. Complement them with a lighter serving of Nietzsche — his ten rules for writers, penned in a love letter.

 

http://www.brainpickings.org/2014/10/15/nietzsche-on-difficulty/

Cornel West was right all along: Why America needs a moment of clarity now

It’s sad that it’s come to this. But social justice in America is officially stunted — and there’s only one answer

Cornel West is knocked over during a scuffle with police during a protest in Ferguson, Missouri, October 13, 2014. (Credit: Reuters/Jim Young)

This past weekend, protestors and activists, including students, clergy, and concerned citizens, descended on Ferguson, Missouri, for a weekend of marches, protests, sit-ins, and civil disobedience billed as Ferguson October. Yesterday, as one activist, Patrisse Cullors-Brignac, an organizer with Black Lives Matter, tweeted exultantly from inside a St. Louis County jail cell about having taken over a Wal-Mart, I waited to hear news of her safe release alongside rapper-activist Tef Poe, attorney and legal observer Justin Hansford, and Palestinian activist Bassem Masri, who handled live streaming for the Organization for Black Struggle.

Of course, many on Twitter could not understand why disturbing the peace in a private business should be acceptable. The point is: we are no longer standing for business as usual. Lest we forget, racial segregation of old happened in “private” businesses, too, in stores like Woolworth’s and Hecht’s. That John Crawford could not step into a Beavercreek, Ohio, Wal-Mart, wander aimlessly, as so many of us have done on a casual shopping trip, and reasonably expect to come out alive suggests that time is out for business as usual.

This nation proclaims Black America backwards, sees us as stuck in the past, declares that our obsession with race –not racism itself – is holding back progress. But really, Black people are the hands on our national clock, controlling the timing of America’s social progress. This time both hands are up, pointing straight to the sky. It is high noon in America. It is the deepest midnight in our hearts.  But, “from the darkness cometh the light,” Lucy A. Delaney proclaimed in the title of her 1891 autobiography. Until those hands move, no longer held hostage by a state-issued weapon, nobody is going anywhere.

Biblical platitudes don’t go far with this crowd, but I’m reminded of a verse: “It is high time to awake out of sleep” (Romans 13:11).

In college, some youth government leaders used this verse with the slogan, “The Awakening. Don’t sleep.” These days, young folks have remixed it, proclaiming “stay woke.” I am sitting with what it means that we have moved from “Don’t Sleep” to “Stay Woke.”

For we have awakened from a long, fitful slumber. We were lulled there by our parents and grandparents, who marched in Selma, sat down in Greensboro, matriculated at Black colleges, and argued before the Supreme Court. They convinced us to adopt their freedom dreams, impressed them into our bodies, in every hug, in every $25 check pressed into a hand from a grandmother to a grandchild on his or her way to bigger and better, in every whispered prayer, in every indignity suffered silently but resolutely in the workplace.



We slept so long our dreams have become nightmares.

Langston Hughes famously asked, “What happens to a dream deferred?

Does it dry up
like a raisin in the sun?
Or fester like a sore—
And then run?
Does it stink like rotten meat?
Or crust and sugar over—
like a syrupy sweet?
Maybe it just sags
like a heavy load.
Or does it explode?”

This Ferguson October, young people are on the ground dreaming new dreams, and in so doing, they are inspiring elders. They are creative, taking over public spaces, not only with signs, and chants, but with impromptu games of twister and double-dutch. From them we learn that play can be political, that there is joy in struggle, that there is no justice without pleasure.

They are lining up, linking arms, and being locked up for justice. They are listening to those who have something to say, and shutting down shit when forced to listen to anyone who doesn’t. They are choosing their leaders, their griots, their truth-tellers, their strategists, their elders.  Showing up matters most. Putting one’s body on the line is the order of the day. They are undignified, improper, unabashed, impolitic, unapologetic, indefatigable.

This weekend they took over four Wal-Marts, in solidarity with John Crawford, who was murdered in an Ohio Wal-Mart. There, prosecutors have cleared officers of wrongdoing. Protestors took signs to the St. Louis Rams game and confronted angry fans who yelled, “I am Darren Wilson.” Two weeks ago, they disrupted the symphony. Exploding dreams cause disruptions. They should be expected to continue.

More than 3,000 new registered voters move among them (Update: This number has since been revised downward to 128). They have collected these new registrations like so many arrows in a quiver. Still they remain skeptical of the vote. And since the presence of Black faces at the voting booth scares white people, they should be. Take Tef Poe’s surly response to Senator Claire McCaskill at a PBS town hall in late September. She argued that we needed new pipelines of local African American leadership. He replied, “I voted for Barack Obama twice, and still got tear-gassed.”

We could mince words about the vast differences and expansive power of local governments in these matters, over against the apparently limited power of federal jurisdiction. But that’s kind of beside the point. Movements are as much about symbols as about substance. And Barack Obama is a broken symbol, a clanging cymbal, unable to say and do anything of use. His silence is the sound of imploding dreams, his words mere distractions and detours from the future we want.

He has become a prime example that being the leader of the free world in a Black body is still no match for entrenched, local, systemic, committed racism.  It’s sad that it has come to this.  But this is bigger than Barack Obama. Just like it was bigger than King and his dream. We have awakened from sleep. We have been startled out of it by nearly 30 gunshots ringing out insistently from the heart of America. Jay-Z might call it “a moment of clarity.” In Obama’s place, Cornel West has re-emerged, the wise and fearless elder, the one who we tried not to listen to, as he screamed into the wind for six years, the one whose approach chafed my hide on more than a few occasions, the one who is — despite all of our collective quibbles and begrudgements – right.

This moment is about all of us. About what kind of America we want to be. About what kind of America we are willing to be, willing to fight for. About whether we will settle for being mediocre and therefore murderous to a whole group of citizens. About whether there are other versions of ourselves worth fighting for.

Don’t sleep. Millennials, it seems, are the ones we have been waiting for. Fearless and focused, they are fighting for a future I want. It is high time to awake out of sleep. Stay woke.

Brittney Cooper is a contributing writer at Salon, and teaches Women’s and Gender Studies and Africana Studies at Rutgers. Follow her on Twitter at @professorcrunk.

http://www.salon.com/2014/10/15/cornel_west_was_right_all_along_why_america_needs_a_moment_of_clarity_now/?source=newsletter

North America is a crime scene: The untold history of America this Columbus Day

The founding myth of the United States is a lie. It is time to re-examine our ruthless past — and present

That the continued colonization of American Indian nations, peoples, and lands provides the United States the economic and material resources needed to cast its imperialist gaze globally is a fact that is simultaneously obvious within—and yet continually obscured by—what is essentially a settler colony’s national construction of itself as an ever more perfect multicultural, multiracial democracy. . . . [T]he status of American Indians as sovereign nations colonized by the United States continues to haunt and inflect its raison d’etre. —Jodi Byrd

The conventional narrative of U.S. history routinely segregates the “Indian wars” as a subspecialization within the dubious category “the West.” Then there are the westerns, those cheap novels, movies, and television shows that nearly every American imbibed with mother’s milk and that by the mid-twentieth century were popular in every corner of the world. The architecture of US world dominance was designed and tested by this period of continental U.S. militarism, which built on the previous hundred years and generated its own innovations in total war. The opening of the twenty-first century saw a new, even more brazen form of U.S. militarism and imperialism explode on the world scene when the election of George W. Bush turned over control of U.S. foreign policy to a long-gestating neoconservative and warmongering faction of the Pentagon and its civilian hawks. Their subsequent eight years of political control included two major military invasions and hundreds of small wars employing U.S. Special Forces around the globe, establishing a template that continued after their political power waned.

Injun Country

One highly regarded military analyst stepped forward to make the connections between the “Indian wars” and what he considered the country’s bright imperialist past and future. Robert D. Kaplan, in his 2005 book Imperial Grunts, presented several case studies that he considered highly successful operations: Yemen, Colombia, Mongolia, and the Philippines, in addition to ongoing complex projects in the Horn of Africa, Afghanistan, and Iraq. While US citizens and many of their elected representatives called for ending the US military interventions they knew about—including Iraq and Afghanistan—Kaplan hailed protracted counterinsurgencies in Africa, Asia, the Middle East, Latin America, and the Pacific. He presented a guide for the U.S. controlling those areas of the world based on its having achieved continental dominance in North America by means of counterinsurgency and employing total and unlimited war.



Kaplan, a meticulous researcher and influential writer born in 1952 in New York City, wrote for major newspapers and magazines before serving as “chief geopolitical strategist” for the private security think tank Stratfor. Among other prestigious posts, he has been a senior fellow at the Center for a New American Security in Washington, D.C., and a member of the Defense Policy Board, a federal advisory committee to the US Department of Defense. In 2011, Foreign Policy magazine named Kaplan as one of the world’s “top 100 global thinkers.” Author of numerous best-selling books, including Balkan Ghosts and Surrender or Starve, Kaplan became one of the principal intellectual boosters for U.S. power in the world through the tried-and-true “American way of war.” This is the way of war dating to the British-colonial period that military historian John Grenier called a combination of “unlimited war and irregular war,” a military tradition “that accepted, legitimized, and encouraged attacks upon and the destruction of noncombatants, villages and agricultural resources . . . in shockingly violent campaigns to achieve their goals of conquest.”

Kaplan sums up his thesis in the prologue to Imperial Grunts, which he subtitles “Injun Country”:

By the turn of the twenty-first century the United States military had already appropriated the entire earth, and was ready to flood the most obscure areas of it with troops at a moment’s notice.

The Pentagon divided the planet into five area commands—similar to the way that the Indian Country of the American West had been divided in the mid-nineteenth century by the U.S. Army. . . . [A]ccording to the soldiers and marines I met on the ground in far-flung corners of the earth, the comparison with the nineteenth century was . . . apt. “Welcome to Injun Country” was the refrain I heard from troops from Colombia to the Philippines, including Afghanistan and Iraq. To be sure, the problem for the American military was less [Islamic] fundamentalism than anarchy. The War on Terrorism was really about taming the frontier.

Kaplan goes on to ridicule “elites in New York and Washington” who debate imperialism in “grand, historical terms,” while individuals from all the armed services interpret policy according to the particular circumstances they face and are indifferent to or unaware of the fact that they are part of an imperialist project. This book shows how colonialism and imperialism work.

Kaplan challenges the concept of manifest destiny, arguing that “it was not inevitable that the United States should have an empire in the western part of the continent.” Rather, he argues, western empire was brought about by “small groups of frontiersmen, separated from each other by great distances.” Here Kaplan refers to what Grenier calls settler “rangers,” destroying Indigenous towns and fields and food supplies. Although Kaplan downplays the role of the U.S. Army compared to the settler vigilantes, which he equates to the modern Special Forces, he acknowledges that the regular army provided lethal backup for settler counterinsurgency in slaughtering the buffalo, the food supply of Plains peoples, as well as making continuous raids on settlements to kill or confine the families of the Indigenous fighters. Kaplan summarizes the genealogy of U.S. militarism today:

Whereas the average American at the dawn of the new millennium found patriotic inspiration in the legacies of the Civil War and World War II, when the evils of slavery and fascism were confronted and vanquished, for many commissioned and noncommissioned officers the U.S. Army’s defining moment was fighting the “Indians.”

The legacy of the Indian wars was palpable in the numerous military bases spread across the South, the Middle West, and particularly the Great Plains: that vast desert and steppe comprising the Army’s historical “heartland,” punctuated by such storied outposts as Forts Hays, Kearney, Leavenworth, Riley, and Sill. Leavenworth, where the Oregon and Santa Fe trails separated, was now the home of the Army’s Command and General Staff College; Riley, the base of George Armstrong Custer’s 7th Cavalry, now that of the 1st Infantry Division; and Sill, where Geronimo lived out the last years of his life, the headquarters of the U.S. Artillery. . . .

While microscopic in size, it was the fast and irregular military actions against the Indians, memorialized in bronze and oil by Remington, that shaped the nature of American nationalism.

Although Kaplan relies principally on the late-nineteenth-century source of US counterinsurgency, in a footnote he reports what he learned at the Airborne Special Operations Museum in Fayetteville, North Carolina: “It is a small but interesting fact that members of the 101st Airborne Division, in preparation for their parachute drop on D-Day, shaved themselves in Mohawk style and applied war paint on their faces.” This takes us back to the pre-independence colonial wars and then through US independence and the myth popularized by The Last of the Mohicans.

Kaplan debunks the argument that the attacks on the World Trade Center and the Pentagon on September 11, 2001, brought the United States into a new era of warfare and prompted it to establish military bases around the world. Prior to 2001, Kaplan rightly observes, the US Army’s Special Operations Command had been carrying out maneuvers since the 1980s in “170 countries per year, with an average of nine ‘quiet professionals’ on each mission. America’s reach was long; its involvement in the obscurest states protean. Rather than the conscript army of citizen soldiers that fought World War II, there was now a professional military that, true to other imperial forces throughout history, enjoyed the soldiering life for its own sake.”

On October 13, 2011, testifying before the Armed Services Committee of the US House of Representatives, General Martin Dempsey stated: “I didn’t become the chairman of the Joint Chiefs to oversee the decline of the Armed Forces of the United States, and an end state that would have this nation and its military not be a global power. . . . That is not who we are as a nation.”

The Return of Legalized Torture

Bodies—tortured bodies, sexually violated bodies, imprisoned bodies, dead bodies—arose as a primary topic in the first years of the George W. Bush administration following the September 2001 attacks with a war of revenge against Afghanistan and the overthrow of the government of Iraq. Afghans resisting U.S. forces and others who happened to be in the wrong place at the wrong time were taken into custody, and most of them were sent to a hastily constructed prison facility on the U.S. military base at Guantánamo Bay, Cuba, on land the United States appropriated in its 1898 war against Cuba. Rather than bestowing the status of prisoner of war on the detainees, which would have given them certain rights under the Geneva Conventions, they were designated as “unlawful combatants,” a status previously unknown in the annals of Western warfare. As such, the detainees were subjected to torture by U.S. interrogators and shamelessly monitored by civilian psychologists and medical personnel.

In response to questions and condemnations from around the globe, a University of California international law professor, John C. Yoo, on leave to serve as assistant U.S. attorney general in the Justice Department’s Office of Legal Counsel, penned in March 2003 what became the infamous “Torture Memo.” Not much was made at the time of one of the precedents Yoo used to defend the designation “unlawful combatant,” the US Supreme Court’s 1873 opinion in Modoc Indian Prisoners.

In 1872, a group of Modoc men led by Kintpuash, also known as Captain Jack, attempted to return to their own country in Northern California after the U.S. Army had rounded them up and forced them to share a reservation in Oregon. The insurgent group of fifty-three was surrounded by U.S. troops and Oregon militiamen and forced to take refuge in the barren and rugged lava beds around Mount Lassen, a dormant volcano, a part of their ancestral homeland that they knew every inch of. More than a thousand troops commanded by General Edward R. S. Canby, a former Civil War general, attempted to capture the resisters, but had no success as the Modocs engaged in effective guerrilla warfare. Before the Civil War, Canby had built his military career fighting in the Second Seminole War and later in the invasion of Mexico. Posted to Utah on the eve of the Civil War, he had led attacks against the Navajos, and then began his Civil War service in New Mexico. Therefore, Canby was a seasoned Indian killer. In a negotiating meeting between the general and Kintpuash, the Modoc leader killed the general and the other commissioners when they would allow only for surrender. In response, the United States sent another former Civil War general in with more than a thousand additional soldiers as reinforcements, and in April 1873 these troops attacked the Modoc stronghold, this time forcing the Indigenous fighters to flee. After four months of fighting that cost the United States almost $500,000—equal to nearly $10 million currently—and the lives of more than four hundred of its soldiers and a general, the nationwide backlash against the Modocs was vengeful. Kintpuash and several other captured Modocs were imprisoned and then hanged at Alcatraz, and the Modoc families were scattered and incarcerated on reservations. Kintpuash’s corpse was embalmed and exhibited at circuses around the country. The commander of the army’s Pacific Military Division at the time, Lieutenant General John M. Schofield, wrote of the Modoc War in his memoir, Forty-Six Years in the Army: “If the innocent could be separated from the guilty, plague, pestilence, and famine would not be an unjust punishment for the crimes committed in this country against the original occupants of the soil.”

Drawing a legal analogy between the Modoc prisoners and the Guantánamo detainees, Assistant U.S. Attorney General Yoo employed the legal category of homo sacer—in Roman law, a person banned from society, excluded from its legal protections but still subject to the sovereign’s power. Anyone may kill a homo sacer without it being considered murder. As Jodi Byrd notes, “One begins to understand why John C. Yoo’s infamous March 14, 2003, torture memos cited the 1865 Military Commissions and the 1873 The Modoc Indian Prisoners legal opinions in order to articulate executive power in declaring the state of exception, particularly when The Modoc Indian Prisoners opinion explicitly marks the Indian combatant as homo sacer to the United States.” To buttress his claim, Yoo quoted from the 1873 Modoc Indian Prisoners opinion:

It cannot be pretended that a United States soldier is guilty of murder if he kills a public enemy in battle, which would be the case if the municipal law were in force and applicable to an act committed under such circumstances. All the laws and customs of civilized warfare may not be applicable to an armed conflict with the Indian tribes upon our western frontier; but the circumstances attending the assassination of Canby [Army general] and Thomas [U.S. peace commissioner] are such as to make their murder as much a violation of the laws of savage as of civilized warfare, and the Indians concerned in it fully understood the baseness and treachery of their act.

Byrd points out that, according to this line of thinking, anyone who could be defined as “Indian” could thus be killed legally, and they also could be held responsible for crimes they committed against any US soldier. “As a result, citizens of American Indian nations become in this moment the origin of the stateless terrorist combatant within U.S. enunciations of sovereignty.”

Ramped Up Militarization

The Chagos Archipelago comprises more than sixty small coral islands isolated in the Indian Ocean halfway between Africa and Indonesia, a thousand miles south of the nearest continent, India. Between 1968 and 1973, the United States and Britain, the latter the colonial administrator, forcibly removed the indigenous inhabitants of the islands, the Chagossians. Most of the two thousand deportees ended up more than a thousand miles away in Mauritius and the Seychelles, where they were thrown into lives of poverty and forgotten. The purpose of this expulsion was to create a major U.S. military base on one of the Chagossian islands, Diego Garcia. As if being rounded up and removed from their homelands in the name of global security were not cruel enough, before being deported the Chagossians had to watch as British agents and U.S. troops herded their pet dogs into sealed sheds where they were gassed and burned. As David Vine writes in his chronicle of this tragedy:

“The base on Diego Garcia has become one of the most secretive and powerful U.S. military facilities in the world, helping to launch the invasions of Afghanistan and Iraq (twice), threatening Iran, China, Russia, and nations from southern Africa to southeast Asia, host to a secret CIA detention center for high-profile terrorist suspects, and home to thousands of U.S. military personnel and billions of dollars in deadly weaponry.”

The Chagossians are not the only indigenous people around the world that the US military has displaced. The military established a pattern during and after the Vietnam War of forcibly removing indigenous peoples from sites deemed strategic for the placement of military bases. The peoples of the Bikini Atoll in the South Pacific and Puerto Rico’s Vieques Island are perhaps the best-known examples, but there were also the Inughuit of Thule, Greenland, and the thousands of Okinawans and Indigenous peoples of Micronesia. During the harsh deportation of the Micronesians in the 1970s, the press took some notice. In response to one reporter’s question, Secretary of State Henry Kissinger said of the Micronesians: “There are only ninety thousand people out there. Who gives a damn?” This is a statement of permissive genocide.

By the beginning of the twenty-first century, the United States operated more than 900 military bases around the world, including 287 in Germany, 130 in Japan, 106 in South Korea, 89 in Italy, 57 in the British Isles, 21 in Portugal, and 19 in Turkey. The number also comprised additional bases or installations located in Aruba, Australia, Djibouti, Egypt, Israel, Singapore, Thailand, Kyrgyzstan, Kuwait, Qatar, Bahrain, the United Arab Emirates, Crete, Sicily, Iceland, Romania, Bulgaria, Honduras, Colombia, and Cuba (Guantánamo Bay), among many other locations in some 150 countries, along with those recently added in Iraq and Afghanistan.

In her book The Militarization of Indian Country, Anishinaabe activist and writer Winona LaDuke analyzes the continuing negative effects of the military on Native Americans, considering the consequences wrought on Native economy, land, future, and people, especially Native combat veterans and their families. Indigenous territories in New Mexico bristle with nuclear weapons storage, and Shoshone and Paiute territories in Nevada are scarred by decades of aboveground and underground nuclear weapons testing. The Navajo Nation and some New Mexico Pueblos have experienced decades of uranium strip mining, the pollution of water, and subsequent deadly health effects. “I am awed by the impact of the military on the world and on Native America,” LaDuke writes. “It is pervasive.”

Political scientist Cynthia Enloe, who specializes in US foreign policy and the military, observes that US culture has become even more militarized since the attacks on the World Trade Center and the Pentagon. Her analysis of this trend draws on a feminist perspective:

Militarization . . . [is] happening at the individual level, when a woman who has a son is persuaded that the best way she can be a good mother is to allow the military recruiter to recruit her son so her son will get off the couch. When she is persuaded to let him go, even if reluctantly, she’s being militarized. She’s not as militarized as somebody who is a Special Forces soldier, but she’s being militarized all the same. Somebody who gets excited because a jet bomber flies over the football stadium to open the football season and is glad that he or she is in the stadium to see it, is being militarized. So militarization is not just about the question “do you think the military is the most important part of the state?” (although obviously that matters). It’s not just “do you think that the use of collective violence is the most effective way to solve social problems?”—which is also a part of militarization. But it’s also about ordinary, daily culture, certainly in the United States.

As John Grenier notes, however, the cultural aspects of militarization are not new; they have deep historical roots, reaching into the nation’s British-colonial past and continuing through unrelenting wars of conquest and ethnic cleansing over three centuries.

“Beyond its sheer military utility, Americans also found a use for the first way of war in the construction of an ‘American identity.’. . . [T]he enduring appeal of the romanticized myth of the ‘settlement’ (not the conquest) of the frontier, either by ‘actual’ men such as Robert Rogers or Daniel Boone or fictitious ones like Nathaniel Bumppo of James Fenimore Cooper’s creation, points to what D. H. Lawrence called the ‘myth of the essential white American.’”

The astronomical number of firearms owned by U.S. civilians, with the Second Amendment as a sacred mandate, is also intricately related to militaristic culture. Everyday life and the culture in general are damaged by ramped-up militarization, and this includes academia, particularly the social sciences, with psychologists and anthropologists being recruited as advisors to the military. Anthropologist David H. Price, in his indispensable book Weaponizing Anthropology, remarks that “anthropology has always fed between the lines of war.” Anthropology was born of European and U.S. colonial wars. Price, like Enloe, sees an accelerated pace of militarization in the early twenty-first century: “Today’s weaponization of anthropology and other social sciences has been a long time coming, and post-9/11 America’s climate of fear coupled with reductions in traditional academic funding provided the conditions of a sort of perfect storm for the militarization of the discipline and the academy as a whole.”

In their ten-part cable television documentary series and seven-hundred-page companion book The Untold History of the United States, filmmaker Oliver Stone and historian Peter Kuznick ask: “Why does our country have military bases in every region of the globe, totaling more than a thousand by some counts? Why does the United States spend as much money on its military as the rest of the world combined? Why does it still possess thousands of nuclear weapons, many on hair-trigger alert, even though no nation poses an imminent threat?” These are key questions. Stone and Kuznick condemn the situation but do not answer the questions. The authors see the post–World War II development of the United States into the world’s sole superpower as a sharp divergence from the founders’ original intent and historical development prior to the mid-twentieth century. They quote an Independence Day speech by President John Quincy Adams in which he condemned British colonialism and claimed that the United States “goes not abroad, in search of monsters to destroy.” Stone and Kuznick fail to mention that the United States at the time was invading, subjecting, colonizing, and removing the Indigenous farmers from their land, as it had since its founding and as it would through the nineteenth century. In ignoring that fundamental basis for US development as an imperialist power, they do not see that overseas empire was the logical outcome of the course the United States chose at its founding.

North America is a Crime Scene

Jodi Byrd writes: “The story of the new world is horror, the story of America a crime.” It is necessary, she argues, to start with the origin of the United States as a settler-state and its explicit intention to occupy the continent. These origins contain the historical seeds of genocide. Any true history of the United States must focus on what has happened to (and with) Indigenous peoples—and what still happens. It’s not just past colonialist actions but also “the continued colonization of American Indian nations, peoples, and lands” that allows the United States “to cast its imperialist gaze globally” with “what is essentially a settler colony’s national construction of itself as an ever more perfect multicultural, multiracial democracy,” while “the status of American Indians as sovereign nations colonized by the United States continues to haunt and inflect its raison d’etre.” Here Byrd quotes Lakota scholar Elizabeth Cook-Lynn, who spells out the connection between the “Indian wars” and the Iraq War:

The current mission of the United States to become the center of political enlightenment to be taught to the rest of the world began with the Indian wars and has become the dangerous provocation of this nation’s historical intent. The historical connection between the Little Big Horn event and the “uprising” in Baghdad must become part of the political dialogue of America if the fiction of decolonization is to happen and the hoped for deconstruction of the colonial story is to come about.

A “race to innocence” is what occurs when individuals assume that they are innocent of complicity in structures of domination and oppression. This concept captures the understandable assumption made by new immigrants or children of recent immigrants to any country. They cannot be responsible, they assume, for what occurred in their adopted country’s past. Neither are those who are already citizens guilty, even if they are descendants of slave owners, Indian killers, or Andrew Jackson himself. Yet, in a settler society that has not come to terms with its past, whatever historical trauma was entailed in settling the land affects the assumptions and behavior of living generations at any given time, including immigrants and the children of recent immigrants.

In the United States the legacy of settler colonialism can be seen in the endless wars of aggression and occupations; the trillions spent on war machinery, military bases, and personnel instead of social services and quality public education; the gross profits of corporations, each of which has greater resources and funds than more than half the countries in the world yet pay minimal taxes and provide few jobs for US citizens; the repression of generation after generation of activists who seek to change the system; the incarceration of the poor, particularly descendants of enslaved Africans; the individualism, carefully inculcated, that on the one hand produces self-blame for personal failure and on the other exalts ruthless dog-eat-dog competition for possible success, even though it rarely results; and high rates of suicide, drug abuse, alcoholism, sexual violence against women and children, homelessness, dropping out of school, and gun violence.

These are symptoms, and there are many more, of a deeply troubled society, and they are not new. The large and influential civil rights, student, labor, and women’s movements of the 1950s through the 1970s exposed the structural inequalities in the economy and the historical effects of more than two centuries of slavery and brutal genocidal wars waged against Indigenous peoples. For a time, US society verged on a process of truth seeking regarding past atrocities, making demands to end aggressive wars and to end poverty, witnessed by the huge peace movement of the 1970s and the War on Poverty, affirmative action, school busing, prison reform, women’s equity and reproductive rights, promotion of the arts and humanities, public media, the Indian Self-Determination Act, and many other initiatives.

A more sophisticated version of the race to innocence that helps perpetuate settler colonialism began to develop in social movement theory in the 1990s, popularized in the work of Michael Hardt and Antonio Negri. Commonwealth, the third volume in a trilogy, is one of a number of books in an academic fad of the early twenty-first century seeking to revive the Medieval European concept of the commons as an aspiration for contemporary social movements. Most writings about the commons barely mention the fate of Indigenous peoples in relation to the call for all land to be shared. Two Canadian scholar-activists, Nandita Sharma and Cynthia Wright, for example, do not mince words in rejecting Native land claims and sovereignty, characterizing them as xenophobic elitism. They see Indigenous claims as “regressive neo-racism in light of the global diasporas arising from oppression around the world.”

Cree scholar Lorraine Le Camp calls this kind of erasure of Indigenous peoples in North America “terranullism,” harking back to the characterization, under the Doctrine of Discovery, of purportedly vacant lands as terra nullius. This is a kind of no-fault history. From the theory of a liberated future of no borders and nations, of a vague commons for all, the theorists obliterate the present and presence of Indigenous nations struggling for their liberation from states of colonialism. Thereby, Indigenous rhetoric and programs for decolonization, nationhood, and sovereignty are, according to this project, rendered invalid and futile. From the Indigenous perspective, as Jodi Byrd writes, “any notion of the commons that speaks for and as indigenous as it advocates transforming indigenous governance or incorporating indigenous peoples into a multitude that might then reside on those lands forcibly taken from indigenous peoples does nothing to disrupt the genocidal and colonialist intent of the initial and now repeated historical process.”

Excerpted from “An Indigenous Peoples’ History of the United States” by Roxanne Dunbar-Ortiz (Beacon Press, 2014). Copyright 2014 by Roxanne Dunbar-Ortiz. Reprinted with permission of the publisher. All rights reserved.

http://www.salon.com/2014/10/13/north_america_is_a_crime_scene_the_untold_history_of_america/?source=newsletter

The incredible vanishing editor: What we can learn from Martin Scorsese’s new documentary

They’re invisible until something goes wrong, as in Alessandra-Stanley-gate. But we need editors now more than ever

Hugh Eakin and Robert Silvers in “The 50 Year Argument” (Credit: HBO/Brigitte Lacombe)

Recently, the author Andrew Solomon, while describing the Amazon reader reviews for his monumental book, “Far From the Tree: Parents, Children and the Search for Identity,” recounted that one reviewer objected to having to pay more than $9.99 for something that consisted of nothing more than electronic bits. “What about the 11 years of my life it took to write it?” Solomon asked plaintively. “What about the months my editor spent working on it?”

That’s a particularly egregious example of how writing remains a form of invisible — or at least translucent — labor. Solomon’s research for “Far From the Tree” encompassed, among other things, interviews with over 300 families; you can see every hour he spent on the book right there on the page. But no matter how infatuated our culture has become with the digital, we still have a hard time appreciating the value of immaterial creations. Consumers who demand that the price of e-books be slashed to less than half the hardcover list price reveal a belief that the work and expertise of a writer are worth less than a handful of paper and cardboard.

And, oh, that poor editor! (“It’s Nan Graham,” Solomon explained in an email. “And I value her more than gold and silver and precious stones.”) If most readers do grudgingly admit that someone must write the books they purchase, they are less willing to concede that the vocation of helping to bring someone else’s writing into the world has much merit. “You are uninformed in that technology is making many of the functions of editor and publisher obsolete,” one commenter posted to a New York Times story about the publishing industry. “Writers should need little to no editing when computers check spelling, etc.”

Even readers who claim to value non-automated editing have little sense of an editor’s actual responsibilities. The familiar grouse that “no one edits anymore” is usually followed by lamentations over the typos, grammatical errors and misspellings someone has found in traditionally published works. But correcting that kind of micro-mistake is the job of a copyeditor (or in some cases a proofreader), not the editor. So if the editor is not in charge of fixing “spelling, etc.,” then what does an editor do?



There’s an expression (taken from “Hamlet”) most often used to refer to a good rule that is frequently broken: honored in the breach. But the phrase might be better deployed to describe a mission whose value we appreciate most when it fails. Like a counterterrorism operative, a good editor has done her best work when you don’t realize she’s done anything at all.

It is only when editors fail — as was the case with, say, the three New York Times editors who gave a pass to Alessandra Stanley’s now-infamous “Angry Black Woman” article on Shonda Rhimes — that their importance becomes evident. The same could be said for A.O. Scott’s much-discussed New York Times Magazine essay, “The Death of Adulthood in American Culture,” which was much-discussed in large part because the through lines of Scott’s argument were muddled by the deployment of too many examples that didn’t really fit his thesis. In both cases, a more culturally knowledgeable editor would have invisibly fixed the problem before the piece ever saw print.

Martin Scorsese’s new film, “The Fifty-Year Argument” (currently being shown on HBO), is a rare tribute to another part of the editor’s job: making writing happen in the first place. A documentary about the history of the venerable New York Review of Books, the film mostly focuses on the Review’s stellar contributors: Joan Didion, Zoe Heller, Darryl Pinckney, Daniel Mendelsohn, and many more. But interspersed with those interviews are quiet scenes that show the NYRB’s 85-year-old editor-in-chief, Robert Silvers, calling up writers and proposing assignments. It’s not great cinema, but it’s far more relevant to the Review’s stature than the more exciting archival footage of Norman Mailer shouting at the audience during a 1971 panel discussion on feminism.

“I needed him so much,” Didion tells the camera, recounting how Silvers had persuaded her to cover several Democratic and Republican conventions “not so much to walk me through it as to give me the confidence I could walk myself through it. … Bob involved me in writing about stuff I had no interest in.” As perverse as this strategy might sound, it does often work. That has been my own experience, particularly with Salon’s founding editor, David Talbot, who was always persuading me to tackle subjects (Michael Jackson) or to cover stories (a murder trial) that I considered off my beat; as a result I learned more than I would have if left to my own devices. Whatever the merits of my own stuff, the culture would be a lot poorer without Didion’s reporting on domestic politics. It shone in large part because, as someone who considered herself uninterested in the topic, she was able to bring a fresh, smart perspective to bear.

While this happens less often with books (it’s difficult to convince anyone to spend that much time working on a subject not of their own choosing), it does occur, albeit sometimes indirectly. James Baldwin’s “The Fire Next Time,” for example, although eventually published in book form, originated as an assignment to write about the Nation of Islam for Norman Podhoretz at Commentary. Editors also convince writers to bail on misbegotten projects and switch to more promising ones, or persuade them to reshape and recast books that aren’t working. Daniel Handler was struggling with a “mock-gothic novel for adults called ‘A Series of Unfortunate Events’” when his agent sent his first, as-yet-unsold adult manuscript (“The Basic Eight” — highly recommended) to a children’s book editor, Susan Rich. She suggested he try writing for a younger audience, and Lemony Snicket was born.

Literary history celebrates many such cases, as well as instances when aggressive editorial “fat trimming” secured an author’s fame. The hierophant/editor Gordon Lish pared Raymond Carver’s short stories down to the bone, and even if Carver was not happy with this transformation, Lish’s scalpel played a major role in Carver’s exaltation as the king of minimalism. Ezra Pound, in an unofficial capacity, performed the same wizardry on T.S. Eliot’s “The Waste Land.”

Maxwell Perkins, the editor who fought for (and with) such writers as Ernest Hemingway, F. Scott Fitzgerald and Thomas Wolfe, is regularly held up as the pinnacle of the editorial profession in years past. This praise is inevitably followed by the complaint that no one even tries to follow his example anymore. But then again, how would we know? It’s not a profession that attracts many spotlight hogs, and besides, the mystique of the solitary genius is so much more romantic than the reality that most good books are the result of a collaboration. I recently found myself sitting with three very talented, highly acclaimed novelists, so I asked them about their editors. “It amazes me,” said one. “They work so hard on our books, and then we get all the credit.” The other two writers nodded in agreement.

If editors like Perkins seem rare today, well, they were probably rare in the 1920s, as well. Not every writer gets the Maxwell Perkins treatment, but then again not every writer is F. Scott Fitzgerald. Furthermore, if every bad book can be held up as evidence that editors are falling down on their jobs, then every good book ought to be admitted as evidence to the contrary. (That includes every book from Gabriel Garcia Marquez, Alice Munro, Zadie Smith, Hilary Mantel and Don DeLillo, to name just five novelists.) It’s in the very nature of editing that the people who do it are doing it best when we forget all about them.

But surely the most endangered editorial role is, not coincidentally, also the most ineffable. In a recent post, Kent Anderson of the Scholarly Kitchen, a blog about academic publishing, reports that scientific and scholarly journals are now spoken of as if peer-reviewing were the only significant part of their editorial process: “one of the most important roles — that of the editor-in-chief or senior editor — seems to have been lost.” Of late, some journals have even been launched without any main editor at all; editorial and advisory boards, combined with peer review, have been deemed sufficient. There is no leader to be held personally accountable for the journal’s choices, and the publication loses something else, as well: vision, character, a personality all its own.

Anderson argues that in scholarly publishing — although the same goes for journalism and book publishing — a good editor-in-chief provides much that is crucial if also intangible. An EIC can be a martinet, a glad-hander, a guru or a sort of veiled prophet, only emerging from her office to utter baffling, oracular pronouncements. The best make their readers, contributors and staff feel like they’ve been invited to the greatest party ever, with the most scintillating, intelligent, profound and witty guests. They attract commercial authors who want to rub shoulders (or bindings) with literary greats, thereby helping to fund the continuing publication of literary greats, and they deploy their charm and expertise to get the best work out of major talents. Gifted younger editors want to work with them and in turn bring in new voices.

While the personality of a journalistic publication is usually clear to its readers, most book buyers are at best dimly aware of the distinct characters of the hundreds of large and small book publishing companies out there. (This includes smaller divisions of big presses, known as imprints, that are frequently named after the individual editors who run them.) Authors, booksellers and other industry types know exactly what each house stands for, but no publisher has had the equivalent of a Steve Jobs, someone whose role is to embody a coherent identity to both insiders and the public at large. For decades, this state of affairs has suited everyone just fine. Now that debates over what an e-book should cost — ultimately, a quarrel over the value of the labor invested in it by everyone other than the author — have gone public, that may no longer be enough.

Further reading

Margaret Sullivan, the New York Times’ public editor, on Alessandra Stanley’s article about Shonda Rhimes

A.O. Scott for the New York Times Magazine on “The Death of Adulthood in American Culture”

HBO’s page for Martin Scorsese’s “The Fifty-Year Argument”

Kent Anderson for the Scholarly Kitchen on “The Editor — A Vital Role We Barely Talk About Anymore”

Laura Miller is a senior writer for Salon. She is the author of “The Magician’s Book: A Skeptic’s Adventures in Narnia” and has a Web site, magiciansbook.com.

http://www.salon.com/2014/10/09/the_incredible_vanishing_editor_what_we_can_learn_from_martin_scorseses_new_nyrb_documentary/?source=newsletter

Joan Didion Answers the Proust Questionnaire

“Misery is feeling estranged from people I love. Misery is also not working. The two seem to go together.”

In the 1880s, long before he claimed his status as one of the greatest authors of all time, teenage Marcel Proust (July 10, 1871–November 18, 1922) filled out an English-language questionnaire given to him by his friend Antoinette, the daughter of France’s then-president, as part of her “confession album” — a Victorian version of today’s popular personality tests, designed to reveal the answerer’s tastes, aspirations, and sensibility in a series of simple questions. Proust’s original manuscript, titled “by Marcel Proust himself,” wasn’t discovered until 1924, two years after his death. Decades later, the French television host Bernard Pivot, whose work inspired James Lipton’s Inside the Actors Studio, saw in the questionnaire an excellent lubricant for his interviews and began administering it to his guests in the 1970s and 1980s. In 1993, Vanity Fair resurrected the tradition and started publishing various public figures’ answers to the Proust Questionnaire on the last page of each issue.

In 2009, the magazine released Vanity Fair’s Proust Questionnaire: 101 Luminaries Ponder Love, Death, Happiness, and the Meaning of Life (public library) — a compendium of answers by such cultural icons as Jane Goodall, David Bowie, Allen Ginsberg, Hedy Lamarr, Gore Vidal, and Julia Child.

Unsurprisingly, some of the most wonderful answers come from 69-year-old Joan Didion — a woman who has endured more personal tragedy than most and has written about it with great dignity and grace, extracting from her experience wisdom on such subtle and monumental aspects of existence as grief, self-respect, and keeping a notebook.

Portrait of Joan Didion by Robert Risko for Vanity Fair

Didion’s answers are particularly poignant for their timing — she answered The Proust Questionnaire in October of 2003, several weeks before her husband died of a heart attack while her only daughter lay comatose in the ICU; though Didion’s daughter did recover from the coma, acute pancreatitis took her life eighteen months later.

What is your greatest fear?

I have an irrational fear of snakes. When my husband and I moved to a part of Los Angeles County with many rattlesnakes, I tried to desensitize myself by driving every day to a place called Hermosa Reptile Import-Export and forcing myself to watch the anacondas. This seemed to work, but a few years later, when we were living in Malibu and I had a Corvette, a king snake (a “good” snake, not poisonous, by no means anaconda-like) dropped from a garage rafter into my car. My daughter, then four, brought it to show me. I am ashamed to say I ran away. I still think about what would have happened had I driven to the market and noticed my passenger, the snake, on the Pacific Coast Highway.

What is the trait you most deplore in yourself?

I find “speaking one’s mind” pretty overrated, in that it usually turns out to be a way of aggrandizing the speaker at the expense of the helpless listener.

What is your favorite journey?

A long time ago, before they showed movies on airplanes and decided to make you close the blinds, I used to love flying west and watching the country open up, the checkerboarded farms of the Midwest giving way to the vast stretches of nothing. I also loved flying over the Pole from Europe to Los Angeles during the day, when you could see ice floes and islands in the sea change almost imperceptibly to lakes in the land. This shift in perception was very thrilling to me.

On what occasion do you lie?

I probably lie constantly, if the definition of lying includes white lies, social lies, lies to ease a situation or make someone feel better. My mother was incapable of lying. I remember her driving into a blinding storm to vote for an acquaintance in an S.P.C.A. election. “I told Dorothy I would,” she said when I tried to dissuade her. “How will Dorothy know?” I asked. “That’s not the point,” my mother said. I’m sorry to report that this was amazing to me.

What do you dislike most about your appearance?

For a while there I disliked being short, but I got used to it. Which is not to say I wouldn’t have preferred to be five-ten and get sent clothes by designers.

Which words or phrases do you most overuse?

Most people who write find themselves overusing certain words or constructions (if they worked once, they get hardwired), so much so that a real part of the exercise is getting those repetitions out.

When and where were you happiest?

Once, in a novel, Democracy, I had the main character, Inez Victor, consider this very question, which was hard for her. She drinks her coffee, she smokes a cigarette, she thinks it over, she comes to a conclusion: “In retrospect she seemed to have been most happy in borrowed houses, and at lunch. She recalled being extremely happy eating lunch by herself in a hotel room in Chicago, once when snow was drifting on the window ledges. There was a lunch in Paris that she remembered in detail: a late lunch with Harry and the twins at Pré Catelan in the rain.” These lunches and borrowed houses didn’t come from nowhere.

What talent would you most like to have?

I long to be fluent in languages other than English. I am resigned to the fact that this will not happen. A lot of things get in the way, not least a stubborn fear of losing my only real asset since childhood, the ability to put English sentences together.

If you could change one thing about yourself, what would it be?

I’m afraid that “one thing” would just lead to another thing, making this a question only the truly greedy would try to answer.

What is your most treasured possession?

I treasure things my daughter has given me, for example (I think of this because it is always on my desk), a picture book called Baby Animals and Their Mothers.

What do you regard as the lowest depth of misery?

Misery is feeling estranged from people I love. Misery is also not working. The two seem to go together.

Where would you like to live?

I want to live somewhere else every month or so. Right now I would like to be living on Kailua Beach, on the windward side of Oahu. Around November, I’m quite sure I will want to be living in Paris, preferably in the Hotel Bristol. I like hotels a lot. When we were living in houses in Los Angeles I used to make charts showing how we could save money by living in a bungalow in Bel-Air, but my husband never bought it.

What is your favorite occupation?

I like making gumbo. I like gardening. I like writing, at least when it’s going well, maybe because it seems to be exactly as tactile a thing to do as making gumbo or gardening.

What is your most marked characteristic?

If I listened to other people, I would think my most marked characteristic was being thin. What strikes me about myself, however, is not my thinness but a certain remoteness. I tune out a lot.

Who is your favorite hero of fiction?

Axel Heyst in Joseph Conrad’s Victory has always attracted me as a character. Standing out on that dock in, I think (I may be wrong, because I have no memory), Sumatra. His great venture, the Tropical Belt Coal Company, gone to ruin behind him. And then he does something so impossibly brave that he can only be doing it because he has passed entirely beyond concern for himself.

 

http://www.brainpickings.org/2014/10/02/joan-didion-proust-questionnaire/

Wild Geese

 
You do not have to be good.
You do not have to walk on your knees
for a hundred miles through the desert repenting.
You only have to let the soft animal of your body
love what it loves.
Tell me about despair, yours, and I will tell you mine.
Meanwhile the world goes on.
Meanwhile the sun and the clear pebbles of the rain
are moving across the landscapes,
over the prairies and the deep trees,
the mountains and the rivers.
Meanwhile the wild geese, high in the clean blue air,
are heading home again.
Whoever you are, no matter how lonely,
the world offers itself to your imagination,
calls to you like the wild geese, harsh and exciting
over and over announcing your place
in the family of things.
 
from Dream Work by Mary Oliver
published by Atlantic Monthly Press
© Mary Oliver
Sleeping in the Forest