Every sci-fi movie since Kubrick’s 1968 masterpiece has echoed the original in certain unavoidable ways

Kubrick’s indestructible influence: “Interstellar” joins the long tradition of borrowing from “2001”

“2001: A Space Odyssey” and “Interstellar” (Credit: Warner Bros./Salon)

When I first heard about Christopher Nolan’s new sci-fi adventure, “Interstellar,” my immediate thought was only this: Here comes the latest filmmaker to take on Stanley Kubrick’s “2001: A Space Odyssey.” Though it was released more than 40 years ago, “2001” remains the benchmark for the “serious” science fiction film: technical excellence married to thematic ambition, and a pervading sense of historic self-importance.

More specifically, I imagined that Nolan would join a long line of challengers to aim squarely at “2001’s” famous Star Gate sequence, where astronaut Dave Bowman (Keir Dullea) passes through a dazzling space-time light show and winds up at a waystation en route to his transformation from human being into the quasi-divine Star Child.

The Star Gate scene was developed by special effects pioneer Douglas Trumbull, who modernized an old technique known as slit scan photography. While we’ve long since warp-drived our way beyond the sequence effects-wise (you can now do slit scan on your phone), the Star Gate’s eerie and propulsive quality is still powerful, because it functions as much more than just eye candy. It’s a set piece whose theme is the attempt to transcend set pieces — and character, and narrative and, most of all, the technical limitations of cinema itself.
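(For the technically curious: the digital descendant of Trumbull’s rig is simple enough to sketch. The toy Python below is my own illustration, not anything from Trumbull’s pipeline or any particular phone app. It builds one image by pulling a single pixel column, the “slit,” from each successive frame of a clip, so that one axis of the picture is space and the other is time; anything that moves gets warped into the kind of streaking the Star Gate trades in.

    import numpy as np

    def digital_slit_scan(frames):
        """Assemble one image from a (time, height, width) grayscale
        stack: output column x is sampled from frame x, so space runs
        down the picture and time runs across it."""
        num_frames, height, width = frames.shape
        out = np.zeros((height, num_frames), dtype=frames.dtype)
        for x in range(num_frames):
            # The "slit": one column per frame, indexed by time.
            out[:, x] = frames[x][:, x % width]
        return out

    # Synthetic demo: a bright bar drifting at half the slit's sampling
    # speed across 200 frames; the scan warps it into the left edge.
    frames = np.zeros((200, 120, 200), dtype=np.uint8)
    for t in range(200):
        frames[t, :, t // 2 : t // 2 + 10] = 255
    print(digital_slit_scan(frames).shape)  # (120, 200)

The phone apps do roughly this with a live camera feed; Trumbull did it optically, with moving backlit artwork and very long exposures.)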

In “2001,” the Star Gate scene is followed by another scene that also turns up frequently in sci-fi flicks. Bowman arrives at a series of strange rooms, designed in the style of Louis XVI (as interpreted by an alien intelligence), and he watches himself age and die before being reborn. Where is he? Another galaxy? Another dimension? Heaven? Hell? What are the mysterious monoliths that have brought him here? Why?

Let’s call this the Odd Room Scene. Pristine and uncanny, the odd room is the place at the end of the journey where truths of all sorts, profound and pretentious, clear and obscure, are at last revealed. In “The Matrix Reloaded,” for instance, Neo’s Odd Room Scene is his meeting with an insufferable talking suit called the Architect, where he learns the truth about the Matrix. Last summer’s “Snowpiercer,” about a train perpetually carrying the sole survivors of a new Ice Age around the world, follows the lower-class occupants of the tail car as they stage a revolution, fighting and hacking their way through first class toward the train’s engine, an Odd Room where our hero learns the truth about the train.



These final scenes in “2001” still linger in the collective creative consciousness as inspiration or as crucible. The Star Gate and the Odd Room, particular manifestations of the journey and the revelation, have become two key architectural building blocks of modern sci-fi films. The lure to imitate and try to top these scenes, either separately or together, is apparently too powerful to resist.

Perhaps the most literal of the Star Gate-Odd Room imitators is Robert Zemeckis’s 1997 “Contact.” It’s a straightforward drama about humanity’s efforts to build a large wormhole machine whose plans have been sent by aliens, and the debate over which human should be the first to journey beyond the solar system. The prize falls to Jodie Foster’s agnostic astronomer Ellie Arroway. During the film’s Star Gate sequence, Foster rides a capsule through a wormhole that winds her around distant planets and through a newly forming star. Zemeckis’s knockoff is a decent roller coaster, but nothing more. Arroway is anxious as she goes through the wormhole, but still in control of herself; a deeply distressed Bowman, by contrast, is losing his mind.

Arroway’s wormhole deposits her in an Odd Room that looks to her (and us) like a beach lit by sunlight and moonlight. She is visited by a projection of her dead father, the aliens’ way of appearing to her in a comfortable guise, and she learns the stunning truth about … well, actually, she doesn’t learn much. Her father gives her a Paternal Alien Pep Talk. Yes, there is a lot of life out in the galaxy. No, you can’t hang out with us. No, we’re not going to answer any of your real questions. Just keep working hard down there on planet Earth; you’ll get up here eventually (as long as you all don’t kill each other first).

Brian De Palma tried his own version of the Odd Room at the end of 2000’s “Mission to Mars,” which culminates in a team of astronauts entering a cool, Kubrick-like room in an alien spaceship on Mars and, yes, learning the stunning truth about the origins of life on Earth. De Palma is a skilled practitioner of the mainstream Hollywood set piece, but you can feel the film working up quite a sweat trying and failing to answer “2001,” and the early-century digital effects depicting red Martians are, to be charitable, somewhat dated.

But here comes “Interstellar.” This film would appear to be the best shot we’ve had in years to challenge the supremacy of the Star Gate, of “2001” itself, as a Serious Sci-Fi Film About Serious Ideas. Christopher Nolan should be the perfect candidate to out-Star Gate the Star Gate. Kubrick machined his visuals to impossibly tight tolerances. Nolan and his screenwriter brother, Jonathan, do much the same to their films’ narratives, manufacturing elaborately conceived contraptions. The film follows a Hail Mary pass to find a planet suitable for the human race as the last crops on Earth begin to die out. Matthew McConaughey plays an astronaut tasked with piloting a starship through a wormhole, into another galaxy and onto a potentially habitable planet. “Interstellar” promises a straight-ahead technological realism as well as a sense of conscious “We’re pushing the envelope” ambition. (Hey, even Neil deGrasse Tyson vouches for the film’s science bona fides.) The possibilities and ambiguities of time, one of Nolan’s consistent concerns as a storyteller, are meant, I think, to be the trump card that takes “Interstellar” past “2001.”

But the film is not about fealty to, or the realistic depiction of, relativity theory. It’s about “2001.” And before it can try to usurp the throne, “Interstellar” must first kiss the ring. (And if you haven’t seen “Interstellar” yet, you might want to stop reading now.) So we get the seemingly rational crew member who proves to be homicidal. The dangerous attempt to manually enter a spaceship. More brazenly, there’s a set piece of one ship docking with another. In “2001,” the stately docking of a spaceship with a wheel-shaped space station, turning gently above the Earth to the strains of “The Blue Danube,” was, quite literally, a waltz, a graceful celestial courtship. It clued us in early that the machines in “2001” would prove more lively, more human, than the humans. “Interstellar” assays the same moment, only on steroids. It turns that waltz, so rich in subtext, into a violent, vertiginous fandango as a shuttle tries to dock with a mothership that’s pirouetting out of control.

Finally, after a teasing jaunt through a wormhole earlier in the movie, we come to “Interstellar’s” Star Gate moment, as McConaughey’s Cooper plummets into a black hole and ultimately into a library-like Odd Room that M.C. Escher might have fancied. It’s visually impressive for a moment, but its imprint quickly fades.

It’s too bad. “Interstellar” wants the stern grandeur of “2001” and the soft-hearted empathy of Steven Spielberg, but in most respects achieves neither. Visually, only a few images impress themselves on your brain — Nolan, as is often the case in his movies, is more successful at designing and calibrating his story than at creating visuals worthy of his ambition. Yet the film doesn’t manage the emotional dynamics, either. It’s not for lack of trying. The Nolan brothers are rigorous scenarists, and the concept of dual father-daughter bonds being tested and reaffirmed across space-time is strong enough on the drawing board. (Presumably, familial love is sturdier than romantic love, though the film makes a half-hearted stab at the latter.)

For those with a less sentimental bent, the thematic insistence on the primacy of love might seem hokey, but it’s one way the film tries to advance beyond the chilly humanism of Kubrick toward something more warm-blooded. Besides, when measured against the stupefying vastness of the universe, what other human enterprise besides love really matters? The scale of the universe and its utter silence are almost beyond human concern, anyway.

So I don’t fault a film that suggests that it’s love more than space-age alloys and algorithms that can overcome the bounds of space and time. But the big ideas Nolan is playing with are undercut by too much exposition about what they mean. The final scene between Cooper and his elderly daughter — the triumphant, life-affirming emotional home run — is played all wrong, curt and businesslike. It’s a moment Spielberg would have handled with more aplomb; he would have had us teary-eyed, for sure, even those who might feel angry at having their heartstrings yanked so hard. This is more like having a filmmaker give a lecture on how to pull at the heartstrings without actually doing it.

Look, pulling off these Star Gate-like scenes requires an almost impossible balance. The built-in expectations in the structure of the story itself are unwieldy enough, without the association to one of science fiction’s most enduring scenes. You can make the transcendent completely abstract, like poetry, a string of visual and aural sensations, and hope viewers are in the right space to have their minds blown, but you run the risk of copping out with deliberate obfuscation. (We can level this charge at the Star Gate sequence itself.)

But it’s easy to press too far the other way — to personify the higher power or the larger force at the end of these journeys with a too literal explanation that leaves us underwhelmed. I suppose what we yearn for is just a tiny revelation, one that honors our desire for awe, preserves a larger mystery, but is not entirely inaccessible. It’s a tiny taste of the sublime. There’s an imagined pinpoint here where we would dream of transcendence as a paradox, as having God-like perception and yet still remaining human, perhaps only for a moment before crossing into something new. For viewers, though, the Star Gate scenes ultimately play on our side of that crossroads: To be human is to steal a glimpse of the transcendent, to touch it, without transcending.

While Kubrick didn’t have modern digital effects to craft his visuals with, in retrospect he had the easier time of it. It’s increasingly difficult these days to really blow an audience’s minds. We’ve seen too much. We know too much. The legitimate pleasure we can take in knowledge, in our ability to decode an ever-more-complex array of allusions and references, may not be as pleasurable or meaningful as truly seeing something beyond what we think we know.

Maybe the most successful challenger to Kubrick was Darren Aronofsky and his 2006 film “The Fountain.” The film, a meditation on mortality and immortality, plays out in three thematically linked stories: A conquistador (Hugh Jackman) searches the New World for the biblical Tree of Life; a scientist (Jackman again) tries to save his cancer-stricken wife (Rachel Weisz); and a shaven-headed, lotus-sitting traveler (Jackman once more) journeys to a distant nebula. It’s the latter that bears the unique “2001” imprint of journey and revelation: Jackman travels in a bubble containing the Tree of Life, through a milky and golden cosmic landscape en route to his death and rebirth. It’s the Star Gate and the Odd Room all in one. Visually, Aronofsky eschewed computer-generated effects for a more organic approach that leans on fluid dynamics. I won’t tell you the film is a masterpiece — its Grand Unifying ending is more than a little inscrutable; again, pulling this stuff off is a real tightrope walk — but the visuals are wondrous and unsettling, perhaps the closest realization since the original of what the Star Gate sequence is designed to evoke.

Having said that, though, it may be time to turn away from the Star Gate in our quest for the mind-blowing sci-fi cinematic sequence. Filmmakers have thus far tried to imagine something like it, only better, and have mostly failed. It’s harder to imagine something beyond it, something unimaginable. Maybe future films should not be quite so literal in chasing those transcendent moments. This might challenge a new generation of filmmakers while also allowing the Star Gate, and “2001” itself, to lie fallow for a while, so we can return to it one day with fresh eyes.

It is, after all, when we least suspect it that a story may find a way past our jaded eyes and show us a glimpse of something that really does stir a moment of profound connection. There is one achingly brief moment in “Interstellar” that accomplishes this: Nolan composes a magnificent shot of a small starship, seen from a great distance, gliding past Saturn’s awesome rings. The ship glitters in a gentle rhythm as it catches the light of the sun. It’s a throwaway, a transitional moment between one scene and another, merely meant to establish where we are. But its very simplicity and beauty, the power of its scale, invites us for a moment to experience the immensity of the unknown and to appreciate our efforts to find a place in it, or beyond it.

http://www.salon.com/2014/11/22/kubricks_indestructible_influence_interstellar_joins_the_long_tradition_of_borrowing_from_2001/?source=newsletter

The incredible vanishing editor: What we can learn from Martin Scorsese’s new documentary

They’re invisible until something goes wrong, as in Alessandra-Stanley-gate. But we need editors now more than ever

 

Hugh Eakin and Robert Silvers in “The 50 Year Argument” (Credit: HBO/Brigitte Lacombe)

Recently, the author Andrew Solomon, while describing the Amazon reader reviews for his monumental book, “Far From the Tree: Parents, Children and the Search for Identity,” recounted that one reviewer objected to having to pay more than $9.99 for something that consisted of nothing more than electronic bits. “What about the 11 years of my life it took to write it?” Solomon asked plaintively. “What about the months my editor spent working on it?”

That’s a particularly egregious example of how writing remains a form of invisible — or at least translucent — labor. Solomon’s research for “Far From the Tree” encompassed, among other things, interviews with over 300 families; you can see every hour he spent on the book right there on the page. But no matter how infatuated our culture has become with the digital, we still have a hard time appreciating the value of immaterial creations. Consumers who demand that the price of e-books be slashed to less than half the hardcover list price reveal a belief that the work and expertise of a writer are worth less than a handful of paper and cardboard.

And, oh, that poor editor! (“It’s Nan Graham,” Solomon explained in an email. “And I value her more than gold and silver and precious stones.”) If most readers do grudgingly admit that someone must write the books they purchase, they are less willing to concede that the vocation of helping to bring someone else’s writing into the world has much merit. “You are uninformed in that technology is making many of the functions of editor and publisher obsolete,” one commenter posted to a New York Times story about the publishing industry. “Writers should need little to no editing when computers check spelling, etc.”

Even readers who claim to value non-automated editing have little sense of an editor’s actual responsibilities. The familiar grouse that “no one edits anymore” is usually followed by lamentations over the typos, grammatical errors and misspellings someone has found in traditionally published works. But correcting that kind of micro-mistake is the job of a copyeditor (or in some cases a proofreader), not the editor. So if the editor is not in charge of fixing “spelling, etc.,” then what does an editor do?



There’s an expression (taken from “Hamlet”) most often used to refer to a good rule that is frequently broken: honored in the breach. But the phrase might be better deployed to describe a mission whose value we appreciate most when it fails. Like a counterterrorism operative, a good editor has done her best work when you don’t realize she’s done anything at all.

It is only when editors fail — as was the case with, say, the three New York Times editors who gave a pass to Alessandra Stanley’s now-infamous “Angry Black Woman” article on Shonda Rhimes — that their importance becomes evident. The same could be said for A.O. Scott’s New York Times Magazine essay, “The Death of Adulthood in American Culture,” which was much discussed in large part because the through lines of Scott’s argument were muddled by the deployment of too many examples that didn’t really fit his thesis. In both cases, a more culturally knowledgeable editor would have invisibly fixed the problem before the piece ever saw print.

Martin Scorsese’s new film, “The Fifty-Year Argument” (currently being shown on HBO), is a rare tribute to another part of the editor’s job: making writing happen in the first place. A documentary about the history of the venerable New York Review of Books, the film mostly focuses on the Review’s stellar contributors: Joan Didion, Zoe Heller, Darryl Pinckney, Daniel Mendelsohn and many more. But interspersed with those interviews are quiet scenes that show the NYRB’s 85-year-old editor-in-chief, Robert Silvers, calling up writers and proposing assignments. It’s not great cinema, but it’s far more relevant to the Review’s stature than the more exciting archival footage of Norman Mailer shouting at the audience during a 1971 panel discussion on feminism.

“I needed him so much,” Didion tells the camera, recounting how Silvers had persuaded her to cover several Democratic and Republican conventions “not so much to walk me through it as to give me the confidence I could walk myself through it. … Bob involved me in writing about stuff I had no interest in.” As perverse as this strategy might sound, it does often work. That has been my own experience, particularly with Salon’s founding editor, David Talbot, who was always persuading me to tackle subjects (Michael Jackson) or to cover stories (a murder trial) that I considered off my beat; as a result I learned more than I would have if left to my own devices. Whatever the merits of my own stuff, the culture would be a lot poorer without Didion’s reporting on domestic politics. It shone in large part because, as someone who considered herself uninterested in the topic, she was able to bring a fresh, smart perspective to bear.

While this happens less often with books (it’s difficult to convince anyone to spend that much time working on a subject not of their own choosing), it does occur, albeit sometimes indirectly. James Baldwin’s “The Fire Next Time,” for example, although eventually published in book form, originated as an assignment to write about the Nation of Islam for Norman Podhoretz at Commentary. Editors also convince writers to bail on misbegotten projects and switch to more promising ones, or persuade them to reshape and recast books that aren’t working. Daniel Handler was struggling with a “mock-gothic novel for adults called ‘A Series of Unfortunate Events,’” when his agent sent his first and yet-unsold adult manuscript (“The Basic Eight” — highly recommended) to a children’s book editor, Susan Rich. She suggested he try writing for a younger audience, and Lemony Snicket was born.

Literary history celebrates many such cases, as well as instances when aggressive editorial “fat trimming” secured an author’s fame. The hierophant/editor Gordon Lish pared Raymond Carver’s short stories down to the bone, and even if Carver was not happy with this transformation, Lish’s scalpel played a major role in Carver’s exaltation as the king of minimalism. Ezra Pound, in an unofficial capacity, performed the same wizardry on T.S. Eliot’s “The Waste Land.”

Maxwell Perkins, the editor who fought for (and with) such writers as Ernest Hemingway, F. Scott Fitzgerald and Thomas Wolfe, is regularly held up as the pinnacle of the editorial profession in years past. This praise is inevitably followed by the complaint that no one even tries to follow his example anymore. But then again, how would we know? It’s not a profession that attracts many spotlight hogs, and besides, the mystique of the solitary genius is so much more romantic than the reality that most good books are the result of a collaboration. I recently found myself sitting with three very talented, highly acclaimed novelists, so I asked them about their editors. “It amazes me,” said one. “They work so hard on our books, and then we get all the credit.” The other two writers nodded in agreement.

If editors like Perkins seem rare today, well, they were probably rare in the 1920s, as well. Not every writer gets the Maxwell Perkins treatment, but then again not every writer is F. Scott Fitzgerald. Furthermore, if every bad book can be held up as evidence that editors are falling down on their jobs, then every good book ought to be admitted as evidence to the contrary. (That includes every book from Gabriel Garcia Marquez, Alice Munro, Zadie Smith, Hilary Mantel and Don DeLillo, to name just five novelists.) It’s in the very nature of editing that the people who do it are doing it best when we forget all about them.

But surely the most endangered editorial role is, not coincidentally, also the most ineffable. In a recent post, Kent Anderson of the Scholarly Kitchen, a blog about academic publishing, reports that scientific and scholarly journals are now spoken of as if peer-reviewing were the only significant part of their editorial process: “one of the most important roles — that of the editor-in-chief or senior editor — seems to have been lost.” Of late, some journals have even been launched without any main editor at all; editorial and advisory boards, combined with peer review, have been deemed sufficient. There is no leader to be held personally accountable for the journal’s choices, and the publication loses something else, as well: vision, character, a personality all its own.

Anderson argues that in scholarly publishing — although the same goes for journalism and book publishing — a good editor-in-chief provides much that is crucial if also intangible. An EIC can be a martinet, a glad-hander, a guru or a sort of veiled prophet, only emerging from her office to utter baffling, oracular pronouncements. The best make their readers, contributors and staff feel like they’ve been invited to the greatest party ever, with the most scintillating, intelligent, profound and witty guests. They attract commercial authors who want to rub shoulders (or bindings) with literary greats, thereby helping to fund the continuing publication of literary greats, and they deploy their charm and expertise to get the best work out of major talents. Gifted younger editors want to work with them and in turn bring in new voices.

While the personality of a journalistic publication is usually clear to its readers, most book buyers are at best dimly aware of the distinct characters of the hundreds of large and small book publishing companies out there. (This includes smaller divisions of big presses, known as imprints, that are frequently named after the individual editors who run them.) Authors, booksellers and other industry types know exactly what each house stands for, but no publisher has had the equivalent of a Steve Jobs, someone whose role is to embody a coherent identity to both insiders and the public at large. For decades, this state of affairs has suited everyone just fine. Now that debates over what an e-book should cost — ultimately, a quarrel over the value of the labor invested in it by everyone other than the author — have gone public, that may no longer be enough.

Further reading

Margaret Sullivan, the New York Times’ public editor, on Alessandra Stanley’s article about Shonda Rhimes

A.O. Scott for the New York Times Magazine on “The Death of Adulthood in American Culture”

HBO’s page for Martin Scorsese’s “The Fifty-Year Argument”

Kent Anderson for the Scholarly Kitchen on “The Editor — A Vital Role We Barely Talk About Anymore”

Laura Miller is a senior writer for Salon. She is the author of “The Magician’s Book: A Skeptic’s Adventures in Narnia” and has a Web site, magiciansbook.com.

http://www.salon.com/2014/10/09/the_incredible_vanishing_editor_what_we_can_learn_from_martin_scorseses_new_nyrb_documentary/?source=newsletter

Heinrich Himmler, family man: Why “The Decent One” is the most haunting documentary I’ve ever seen

A devastating doc drawn from a trove of private papers reveals the Nazi war criminal as husband and dad


Heinrich Himmler and his family, in an undated photograph from “The Decent One.”

Much of the history of human thought has revolved around our efforts to understand the nature of evil, which have never yielded anything like a satisfactory result. We are fascinated by serial killers and murderous dictators, torn between the obvious fact that they are human beings like ourselves and the conviction that in some fundamental way they must be different. Fictional embodiments of evil, like J.R.R. Tolkien’s Sauron or J.K. Rowling’s Lord Voldemort, are essentially reassuring because they are nothing except evil; both are built on the Satanic pattern, meaning they were once pure, but have left their uncorrupted selves far behind. Even the perversely noble Satan of “Paradise Lost,” after nine days of torture by fire, knows only one purpose: “immortal hate” and “eternal War irreconcileable” against the power of heaven.

Set against this conception of inhuman monstrosity we have the countervailing evidence, famously marshaled by Hannah Arendt under the controversial term “the banality of evil.” In latter days, many scholars and Holocaust survivors have contended that Arendt misapplied this term to Adolf Eichmann. (Most notably, the late Benjamin Murmelstein, who knew Eichmann well, told “Shoah” director Claude Lanzmann that Eichmann was a sadist and a zealous anti-Semite.) But the dispute over Eichmann’s personality oversimplifies the profound philosophical insight that lies behind Arendt’s phrase, which speaks to the fact that people who do terrible things, and who hold beliefs most of us find noxious and inexplicable, often appear to be entirely normal in other areas of life. Those Technicolor films of Hitler snuggling his beloved German shepherd (shot by Eva Braun) have always struck me as more disturbing than his long, ranting speeches. Der Führer looks like any other high-level functionary of the business or political realms, seeking an outlet for repressed emotion. He doesn’t look good or evil; he looks human.



That brings us, however reluctantly, to the subject of Heinrich Himmler, the focus of Vanessa Lapa’s devastating collage documentary “The Decent One,” and a highly viable candidate for the title of Worst Human Being Ever. As leader of the SS and head of the Gestapo, Himmler almost certainly knew more about the details of the Holocaust than Hitler did. As the private letters and documents that form the principal text of “The Decent One” make clear, he was a true believer in Nazi master-race ideology, who enthusiastically embraced the project of exterminating all the Jews in the Third Reich’s zone of control. Yet any attempt to describe Himmler as a monster founders on the mountain of ordinary domestic details. His letters home to his wife, Marga, are affectionate and playful, laden with the role-playing and in-jokes that are the inner substance of any marriage. Later on, of course, Himmler writes similar letters to the mistress with whom he had two illegitimate children – but old-fashioned bourgeois adultery is the deed of a man, not a monster.

While the cumulative effect of “The Decent One” is difficult to summarize, and some viewers have felt alienated by Lapa’s formal and structural choices, I found this film to be one of the most profoundly disturbing cinematic experiences in a life full of them. It has the slow-building nightmare force of one of Michael Haneke’s films, which have their roots (it hardly needs to be said) in the same 20th-century European darkness. Even those steeped in World War II and Holocaust history will learn things from “The Decent One,” and none of them are likely to make you sleep better at night. Himmler is not made less repulsive in this human-scale portrait; indeed, he seems more despicable than ever. But Lapa constantly confronts us with the fact that we cannot easily thrust him away as a psychotic aberration, not when he seems like such a recognizable type.

Himmler is the guy who more or less successfully compartmentalizes his family life and the unlovely details of a demanding professional career. He reads an adoring letter from his daughter, Gudrun (known as Püppi) – who at one point seems mystified by the fact that her science teacher caught her cheating on a test but did not punish her – and later that day or the next reads a secret memo from an underling discussing technical problems with the “death wagons,” the rolling gas chambers then being used to asphyxiate Jews and Communists along the Eastern Front. (I don’t want to tell you what the memo was about; I wish I didn’t know.) I’m not suggesting that oil-company executives, corporate lobbyists or predatory venture capitalists are guilty of crimes on anywhere near Himmler’s scale, but the psychological mechanism involved is about the same.

“The Decent One” is based on a trove of recently revealed material whose provenance is somewhat mysterious, although as far as I know its authenticity is not in question. Lapa’s film suggests that the Himmler family’s private archive was stolen by American soldiers, and then sold on the black market, after Himmler’s home in Bavaria was seized in the spring of 1945. The collection, which includes Himmler and Marga’s letters, Püppi’s childhood diary, many other documents and numerous family photographs, apparently found its way to Israel and was held there for decades, until Lapa’s father purchased it privately a few years ago. There’s something more than a little unsavory about that history, as if the criminality of the papers’ former owners had leached into them somehow and infected future generations. I suppose it all fits together.

In the early stages of “The Decent One,” which sketch out a brief portrait of Himmler’s unexceptional childhood in a privileged Bavarian Catholic family, Lapa stitches together archival footage from many sources, most of which is simply meant to illustrate provincial German life early in the 20th century. Almost none of this early material directly pertains to Himmler (and she never claims it does), but some viewers may be confused about that. Throughout the film, she often adds an artificial soundtrack – gunshots, children’s cries, the roar of motorcycle engines – to original newsreel or archival footage that is almost certainly silent. I noticed this and wondered about it and then pretty much stopped thinking about it, but on balance it feels like a mistake, an attempt to add aesthetic immediacy to material that does not require it.

Like his future boss, young Heinrich Himmler was nothing like the ideal of blond, muscular Aryan masculinity. Childhood photos reveal a spindly, bespectacled kid, who was often ill and only moderately successful at school. There isn’t enough evidence to know whether he was bullied or abused, but Himmler certainly didn’t have the outdoorsy, athletic boyhood full of hearty camaraderie that he later prescribed for aspiring SS members. He was rejected for fraternity membership at the University of Munich, where Himmler’s diary reveals that the brothers once discussed “homosexuality and the Jewish question,” apparently a first reference to a topic that would dominate his adult life. It’s way too easy to sum up the Nazi leaders as vindictive little men trying to make up for a perceived deficit of masculinity, since that describes many people who don’t become mass murderers. (Or who just don’t get the chance.) At any rate, in Himmler’s case as in Hitler’s, the shoe seems to fit.

As a young man in the mid-1920s, Himmler joined the Nazi Party and also married Marga, a woman seven years older than he who at first seemed the dominant partner. This movie isn’t a history of the former relationship, or of the role Himmler played as Hitler’s most loyal devotee and lieutenant, and you’ll have to get that elsewhere. Marga professes no interest in her husband’s political leanings, at one point needling him: “Why do you go to a Hitler rally when you know what he’s going to say?” She ultimately joined the party too, but with that mixture of loyalty, willful blindness and self-delusion typical of Nazi wives, it’s not clear how much she ever understood about what her husband did, or his enormous personal responsibility for the Final Solution and its implementation. There’s some casual anti-Semitism in their early letters, including unsavory banter about a Jewish moneylender, but nothing remotely unusual for that time and place.

But the self-delusion was not Marga’s alone. Lapa’s title is drawn from an apparently preposterous remark Himmler made, after the war had turned against Germany and the end was in sight: “We can have but one desire as to what is said about us: these German officers, these German soldiers – they were decent.” At another point, he writes fervently about how German troops do not mistreat animals or inflict unnecessary suffering – adding, almost in parenthesis, that this cannot apply to Jews and other “subhumans,” who are below the status of animals. Part of the decency of German soldiers, in Himmler’s view, lay in the fact that many of them suffered psychological damage after committing war crimes. He repeatedly expressed concern about this problem, receiving a secret memo about a prominent general’s mental breakdown and urging that troops on the Eastern front be given free evenings full of drinking, music, theater and wholesome German culture.

There’s no indication that Himmler thought he was being a horrifying hypocrite when he recommended cabaret entertainment as therapy for mass murderers. Indeed, those were the worst parts of the film for me, exactly because you can feel Himmler struggling with himself, just a little, and trying to fend off the terrible cognitive dissonance of what he has done and what he claims to be. Whether she intends this or not, Lapa forces us to confront apparently incompatible truths: Himmler was a monstrous war criminal, one of history’s very worst, and he was also an apparently normal person who somehow screwed himself up to do these terrible things and suffered for it. I’m not saying he suffered enough, God knows, or that his suffering makes up for anything. But this realization forces us to recognize ourselves in him, just a little, and to recognize the small bad things each of us has done as the seeds of very big bad things.

“The Decent One” is now playing at Film Forum in New York, and opens Oct. 10 at the Music Hall in Los Angeles, with national release to follow.

http://www.salon.com/2014/10/02/heinrich_himmler_family_man_why_the_decent_one_is_the_most_haunting_documentary_ive_ever_seen/?source=newsletter

Terry Gilliam: Hollywood is just “gray, frightened people” holding on for dear life

The godfather of dystopian cinema on the death of Hollywood, why he gave up U.S. citizenship and his new movie

 

Terry Gilliam, on the set of “The Zero Theorem” (Credit: Amplify Releasing)

If you want to illustrate the old adage about a prophet who is without honor in his own country, look to Terry Gilliam. Mind you, I guess America isn’t even Gilliam’s country anymore, and neither is Hollywood. The onetime Monty Python member and director of “Time Bandits,” “Brazil” and “12 Monkeys,” although he was born in Minneapolis and spent his teen years in Los Angeles, has lived as an expatriate for many years and renounced his United States citizenship in 2006. (That was partly in protest of George W. Bush and partly, Gilliam has said, to shield his wife and children from tax liability.)

Gilliam didn’t need to repudiate his relationship with the mainstream film industry, which had pretty much turned its back on him after the commercial failure of “Fear and Loathing in Las Vegas” in 1998 – a movie that looks, in retrospect, like the ultimate distillation of his grotesque and visionary directorial style. Gilliam pioneered the blend of fantasy and dystopian science fiction in the days before CGI, when those things seemed like geeky and bizarre niche interests. Go back and look at the remarkable special effects in another underappreciated box-office flop, “The Adventures of Baron Munchausen,” made in 1988. He was just a few years too early, but his influence is everywhere in contemporary cinema and culture, even as his later career has been a remarkable parade of near-misses and not-quites. Not for nothing is the aging, threadbare rebel leader played by John Hurt in Bong Joon-ho’s “Snowpiercer” named Gilliam!



Gilliam was supposedly J.K. Rowling’s first choice for “Harry Potter and the Sorcerer’s Stone,” and Alan Moore’s first choice for “Watchmen,” and can you even imagine how weird and great those movies would have been? His list of never-launched projects includes an adaptation of “A Tale of Two Cities” with Mel Gibson and an adaptation of Mark Twain’s “A Connecticut Yankee in King Arthur’s Court.” He still hopes to make “The Man Who Killed Don Quixote” with Johnny Depp, a production that legendarily imploded 15 years ago. His new movie, “The Zero Theorem,” starring Christoph Waltz as an isolated computer programmer searching for the meaning of life in an overloaded info-society not far removed from our own, has been in the works for at least six years. It was originally cast with Ewan McGregor in the lead, and then Billy Bob Thornton (alongside Jessica Biel and Al Pacino), in a version that was all set to go in 2009, before Gilliam turned his attention to finding a way to finish yet another haunted film, “The Imaginarium of Doctor Parnassus,” after the death of its star, Heath Ledger.

Gilliam has described “The Zero Theorem” as the completion of an “Orwellian triptych” that began with his dystopian masterpiece “Brazil” in 1985 and continued with “12 Monkeys” a decade later. I won’t argue that this film lives up to the earlier two, but taken on its own terms it’s a work of wicked wit and imagination, shot by cinematographer Nicola Pecorini in classic Gilliam style: wide lenses, deep focus, and the frame overloaded with grotesque detail. Waltz gives one of his best performances as a barely functional nerd-genius called Qohen, who refers to himself in the first-person plural and works on disturbing data-crunching tasks for an enormous company that seems to have devoured the state. There are all sorts of delicious supporting turns, including David Thewlis as the cheerful boss determined to draw Qohen out of his shell, Tilda Swinton as an online therapy-bot (who performs a disastrously awful rap song) and Matt Damon as the back-room kingpin known only as Management.

As veteran Gilliam-watchers will already suspect, “The Zero Theorem” is more tragedy than farce, despite all its levels of technological and social satire. It’s the story of a man who is highly skilled in some ways but finds himself out of step with the world and with his time, desperate to connect with others but finally unable to do so. Gilliam didn’t write this script (which is by Pat Rushin, a creative writing instructor), but immediately agrees that he responded to it on a personal level. He wasn’t sure whether he’d be able to visit New York for “The Zero Theorem” – as an ex-citizen, he is limited to 29 days a year in the U.S., and needs to parcel them out carefully – so he called me from his home office in London.

This must be an exhausting ritual, being grilled by Americans on the phone – and about the past, in many cases.

It’s fine. After becoming depressed like Qohen for being alone as much as I am, I’m happy to talk to people.

It’s funny that you say that. Maybe this is dime store psychology, but I was irresistibly drawn to that interpretation. The curmudgeonly guy who hates everybody, who’s locked in his house trying to solve a problem that cannot be solved. How much is that you?

That’s 100 percent me [laughs]. No, I identify with him so much. I thought Christoph made him an interesting character despite his behavior. I think I’m getting more and more curmudgeonly as the years pass, because you get angry. You look at healthy young people and realize your body doesn’t do that any more, so you get even more angry.

Well, and then there’s your relationship with the film industry, which was maybe never so terribly warm and fuzzy. Is it that you have changed, or that the nature of the mainstream film industry has changed? Or have the two of you just sort of drifted further apart?

I think we’ve both changed and probably drifted apart for that reason, even more. In Hollywood, at least when I was making films there, there were people in the studios that actually had personalities. You could distinguish one from the other. And now, I don’t see that at all. It’s just gray, frightened people holding on without any sense of “let’s try something here, let’s do something different.” But to be fair, I haven’t been talking to anybody from the studios in the last few years. But the films that Hollywood is making now, it’s clear what’s going on. The big tent-pole pictures are just like the last tent-pole pictures. Hopefully one of them will work and keep the studio going. It’s become … it’s a reflection of the real world, where the rich get richer and the poor get poorer and the middle class get squeezed out completely. So the kind of films I make need more money than the very simple films. Hollywood doesn’t deal with those budgets anymore; they don’t exist.

You can’t make the film in your house for $50,000. But they’re also not going to give you $100 million. You’re in a mid-budget area they don’t like, right?

Yeah. It’s terrible. I’m not alone in the mid-budget area that’s being pushed out of work. It’s a great sadness because there are many small films that can be wonderful, or you get huge $100 million-plus budgets and they’re all the same film, basically, or very similar. It’s just not as interesting as it used to be. The choice out there is less interesting. The real problem now is that when you make a small film, to get the money to promote it is almost impossible. You can’t compete with the $70-80 million budgets the studios have. So it becomes less and less interesting. That’s why, in a sense, the most interesting work at the moment, as any creative person knows, is coming out of television in America now, not coming out of the studios.

The studios have two niches, and the problem is that you don’t fit in either one of them. You’re not going to do a “Transformers” movie for $250 million. And they think you’re not the right person to do the movie that maybe costs $40 million and is aimed at the Oscars, or is a prestige literary adaptation or something. They don’t trust you with those, right?

I wouldn’t trust me with them either. [Laughs.] I just want to do what I do. And I don’t even get scripts from Hollywood. I don’t even ask for scripts anymore because I kind of know what they’re going to be. They don’t interest me, so I’ve chosen to wander in the wilderness for another 40 years. We’ll see how it goes.

Going back to the character that Christoph plays in this film, there’s so much going on on the surface, but what really got to me was the tremendous sadness. This person who has a creative drive, a creative urge, and is in a situation where there’s no way for him to fulfill that. That struck me as a situation of extreme pathos.

Well that’s how I see the film. It’s very funny but it’s basically tragedy. It’s very sad. It does move me. You can sort of do the parallels between me and that guy, but at heart that’s not really what it is. Not getting to do what you want to do is one thing, but his problem is that he lets life and relationships fall apart because he can’t grasp them. He’s so damaged — I think the scene when Bainsley [a femme fatale played by French actress Mélanie Thierry] leaves and offers for him to come along, he can’t do it. For me that’s the core scene of the film. What happened to this guy? So in the end, I had to leave him with some kind of sense of dignity and a kind of peace. It may only exist in a virtual world and at least he can let the sun set. He can control that much. I mean, when I make a film, there’s always a big autobiographical element in it, that’s the only way I know how to make films. I have to identify with the character in one way or another. And this one, in retrospect, ended up being quite interesting because when I started it I didn’t think it was going to be that film exactly, but that’s what it became.

Your portrayal of the world is so interesting, people will inevitably look back to your earlier films and other people’s. I felt like you were referencing “Blade Runner,” which came out just before “Brazil,” more directly than you ever have before. But the important thing to me was that this portrait of the informational clutter of the world is almost not a satire or an exaggeration. It’s maybe a tiny bit exaggerated, but it’s almost a portrait of the real world.

Yes. Thank you for that. People talk about it in some sort of future, dystopic view. No! It’s exactly what’s going on right now as far as I’m concerned. [Laughs.] When I walk out onto the streets of London, I’m bombarded exactly like Qohen is at the beginning of the film. It’s endless, it seems to me. And that’s why I sort of built that world around him. Everything in the world out there is colorful and people seem to be having a good time and shopping is bubbling away and things are being offered to you left, right and center. The workplace is a colorful place with people zipping around having a great time. There’s only one bit of darkness and grayness in the thing and that’s Qohen. And that’s what intrigued me about him. He’s very much like a monk. He’s in a burned-out church and it’s a church that has no meaning anymore. That particular construct of life has passed him by. And yet, that’s why I love when another character tells him, “Nonetheless, you’re a man of faith. And that is the very thing that has made you not live your life.”

I have to recommend to you a documentary called “Web Junkie” that you probably haven’t heard about, which is about young people in China being sent to re-education camps to cure them of their supposed Internet addiction. This is a movie that you could have invented, that happens to be a true story. It’s like a corollary to or the dark side of this film. You use technology in an interesting way in this film: You have tons of digital effects in here but the movie is also about the social effects of technology. What is your personal attitude about the way that it’s changed our lives?

It terrifies me because I’m a junkie; I’m an addict. I’m sitting here right now in front of a high-definition computer screen. It’s consumed far, far too much of my life. It’s very seductive and that’s what makes me crazy. The days go by and I’m finding myself still sitting in front of that screen when I should be out doing something physical, but it’s easier to sit there and poke around on the Web. That’s one side of it. As far as tweeting and all of that, I don’t. And I find it appalling that people cannot just get on and experience the moment. They seem to have to comment on the moment all the time. Andy Warhol’s 15 minutes of fame is now 15 megabytes of fame. Everybody, it’s all about them. The world becomes background to them as opposed to them relating to the world.

We did a thing in the Python reunion show at the end, a meet-and-greet where people paid extra money to meet and greet us. And there was a barrier that kept us from touching. And people at the beginning said “Everybody does a selfie now.” So we had them all turn their backs to us and hold the phone up and we were in the background smiling while they were in the foreground of the picture. It’s just — this is crazy. You go to a rock concert and before the first song is finished, the tweets are coming through. It makes me crazy because people are not relating to the real world anymore. That’s very worrisome. Hunter Thompson predicted America would soon be a nation of panicky sheep, and I think it’s adding to the problem.

You made “Time Bandits,” “Brazil” and “Baron Munchausen” in the ‘80s, long before the rise of CGI, and now we find ourselves in an era where every Hollywood movie is either a dystopian science fiction vision or a fantasy film. Do you feel like you explored the territory first and nobody remembers that?

Well, I don’t know. I think, yeah. I keep thinking I would like to have a chance at those kinds of things, because that’s what I’ve always wanted to make and that’s what I did back then. I just don’t know what they are anymore. They’re films — fantasy without substance and sci-fi dystopia without intelligence. I don’t know really what to comment except that they all seem to be clones of each other. And people are so happy to go back and see the same thing again and again and again. And that, to me, makes me sad about the state of the world. We want reassurance now rather than being challenged, and that’s sad.

This is what I always say when people ask me about the difference between watching 300 films a year and watching just a few, like ordinary moviegoers. Ordinary people seem largely OK, or at least historically OK, with seeing the same films over and over again, for whatever reason. Maybe with technical innovations or improvements in execution …

Technically they’re brilliantly done. They’re beautiful things but there’s nothing in them. There’s nothing new. Nothing to make you think or look at the world in a different way. It’s just the same thing going on and on and on. It really is bread and circuses these days. It may be a sign of people’s impotence, that they can’t really change anything so let’s keep going back and have that McDonald’s burger because we know exactly what we’re about to get and let’s watch another Marvel Comics film because we know exactly what we’re going to get.

You have, however, adjusted to the era of doing the effects digitally. Obviously, when you did “Munchausen” or “Time Bandits,” that was all either physical effects or camerawork.

It hasn’t changed anything really, for me. Because I’m sort of forced into the world of “You have to do it the cheapest way.” And fortunately or unfortunately, CGI is cheaper than doing old-fashioned effects. So I end up doing it that way. I’ve always used digital effects. In “12 Monkeys” there’s a lot of stuff in there. I just don’t want — I want it to be a minor part of the filmmaking process, to deal with the things I can’t quite do. What’s happening now is because you can do it with CGI, anything you want, well, that doesn’t mean you should do anything you want. I like the restrictions. Maybe it’s a way of controlling myself, having to work with a small budget. This film is about $8.5 million, but it looks a lot more expensive than that. What you do is go to Bucharest, where people get paid a fraction of what they get paid in London. You call some friends up and they work for scale rather than what they normally get paid when they work for Hollywood. So I’m happy when Hollywood pays some of my friends a lot of money so they can come work for me for next to nothing.

Actors still want to work with you, right? That’s the good part.

Yeah, that’s nice. I think because so many of them are looking for interesting parts and they’re bored with the stuff — they’ll do it because it pays good money, it pays the mortgage and buys time to do more interesting things. I think that’s kind of it. I suppose what actors like about me is they know I’ll give them space to show off and have some fun and they can do things they wouldn’t normally do. I love Matt [Damon] in the film. I think he’s fantastic as that character. I’ve never seen him do that before, and that’s great. I actually said, “Matt, I’ve got a small part. A few days’ work.” He said, “Don’t bother, I’m in.” It’s nice to have friends like that, who will do it for the joy of doing it. And we had a great time. Working with Christoph was an utter joy because he’s thinking all the time; he’s questioning. He’s utterly brilliant. Personally I think it’s the best thing he’s ever done. It might not be the most popular thing he’s ever done, but he’s never off-screen and he’s just stunning.

I don’t think anyone has ever asked Tilda Swinton to rap in a movie before. It’s possible that nobody will again.

Yes. I think she’s sealed her fate. Her MTV career is over. Here’s a wonderful thing though: I had designed this hairstyle for her and when we first met to talk about what she was wearing and the wig, she had decided that as a psychiatrist she should identify more with her patient, so she should be bald. That wasn’t in the script. That was her idea.

I need to ask whether you’ve seen “Snowpiercer,” because if there was ever a film that bears your influence, that one is it.

I desperately want to see “Snowpiercer”! But it hasn’t arrived in this country yet. Especially since John Hurt plays a character called Gilliam. I have seen the trailer and it looks fantastic. It looks really good and really beautiful.

Well, and I think Bong Joon-ho really followed your example, in some ways. Instead of coming to Hollywood and making the film in a way that required surrendering control, he made it in Europe with European money. He got a pretty large budget, I think it was $40 million, but it was assembled from all over the place — a bunch from different European financiers and a little bit from Harvey Weinstein. And then he didn’t have to make a movie the studios were going to chop up.

And yet Harvey still tried to chop it up. At least it got made. That’s the important thing.

I’m hearing rumors that you may finally make the Don Quixote film! What’s the level of truth or fiction there?

Yep. Well, we’ve postponed it. I was planning to shoot it in October this year, but because of the Python reunion show, I postponed it. The two lead actors, their agents and the producer are in discussion as we speak. And yeah, the plan is to be shooting it next springtime. We have locations in the Canary Islands already. I’m assuming we’re going to make it. I’m just suspending all my disbelief. [Laughs.]

Your entire career has involved a certain amount of doing that, right? It’s been required.

I just don’t want to accept the world as it is out there. We’ll see how that goes this time. [Laughs.] That movie has taken 15 years. It’s reached the point where I withhold a lot of my enthusiasm at the moment. I’m waiting to make sure everything is nailed down and then I’ll let go and make this thing happen. I’ve been at it so long it’s almost like it’s fake. It’s like trying to remove a tumor so I can get on with my life.

One more question: I know you gave up your American citizenship about eight years ago. I’m sure you have followed the news about Edward Snowden and the NSA stuff. Did that sum up some of the reasons why you didn’t want to be an American anymore? And secondly, didn’t the whole Snowden episode feel like something that would happen in one of your films?

Edward Snowden is a great hero, I think. It’s quite extraordinary what he’s done and yeah, all I know is I’m luckier than him. I live in England, not Russia. I was in Moscow a month and a half ago and as interesting as it is, I’m happier here. And yes, it’s part of the reason — with George W. Bush and that whole gang that has completely restructured what America is — and a Supreme Court that is so unbelievably right wing. It’s a country that is basically ruled by corporations at the expense of the citizens, I believe.

“The Zero Theorem” opens this week in many cities, and is also available on-demand from cable, satellite and online providers.

http://www.salon.com/2014/09/19/terry_gilliam_hollywood_is_just_%E2%80%9Cgray_frightened_people%E2%80%9D_holding_on_for_dear_life/?source=newsletter

“GRAPES OF WRATH” AND THE AMERICAN EXPERIENCE


In a Labor Day weekend mood, I watched “Grapes of Wrath” again this evening. Labor Day is, after all, a celebration of the American labor movement, dedicated to the social and economic achievements of workers. “Grapes of Wrath” portrays familiar themes in the American worker experience: from the displaced farmers of Oklahoma to today’s baristas and Twitter workers with degrees, there is a continual struggle between workers and those with wealth who want cheap, easily manipulated labor.

The wealthy pretty much got their way in the States until the Depression (the product of rich people gambling to get richer) fueled a rebalancing of the worker/owner relationship, more in favor of the worker, under FDR and his New Deal. This balance, which was great for the overall health of the country, continued through LBJ and the Great Society. Now things are going the other way, with the wealthy neoliberal controller classes producing a political and economic system that assures their success no matter which of the two political parties wins. Reagan, Clinton, Bush and now Obama dismantled the Great Society, fought to break the worker unions, and deregulated banking and other entities once deemed “public trusts.” The resultant series of economic crises and bursting bubbles destroyed the working and middle classes and threatens to remove what’s left of the social safety nets.

Tom Joad’s famous final speech (excerpted below) to his Ma in the movie “Grapes of Wrath” powerfully expressed the thoughts and yearnings of the Depression-era worker, and it resonates with the increasingly disenfranchised workers of today. The American revolutionary, Tom Joad, espousing collective action that creates change, is a familiar subplot in the American drama. What distresses me about this speech is Tom’s dream of spreading wealth more justly “…if all our folks got together and yelled…”. In this 21st century, people yell for a few months (Occupy) and then the owners’ illusion and control return. In the age of the “meh generation” and Ayn Rand, the notion of a collective soul is anathema.

 

Tom Joad: I been thinking about us, too, about our people living like pigs and good rich land layin’ fallow. Or maybe one guy with a million acres and a hundred thousand farmers starvin’. And I been wonderin’ if all our folks got together and yelled…

Ma Joad: Tommy, you’re not aimin’ to kill nobody.

Tom Joad: No, Ma, not that. That ain’t it. It’s just, well as long as I’m an outlaw anyways… maybe I can do somethin’… maybe I can just find out somethin’, just scrounge around and maybe find out what it is that’s wrong and see if they ain’t somethin’ that can be done about it. I ain’t thought it out all clear, Ma. I can’t. I don’t know enough.

Ma Joad: How am I gonna know about ya, Tommy? Why they could kill ya and I’d never know. They could hurt ya. How am I gonna know?

Tom Joad: Well, maybe it’s like Casy says. A fellow ain’t got a soul of his own, just a little piece of a big soul, the one big soul that belongs to everybody, then…

Ma Joad: Then what, Tom?

Tom Joad: Then it don’t matter. I’ll be all around in the dark – I’ll be everywhere. Wherever you can look – wherever there’s a fight, so hungry people can eat, I’ll be there. Wherever there’s a cop beatin’ up a guy, I’ll be there. I’ll be in the way guys yell when they’re mad. I’ll be in the way kids laugh when they’re hungry and they know supper’s ready, and when the people are eatin’ the stuff they raise and livin’ in the houses they build – I’ll be there, too.


Letter To The Millennials

A Boomer Professor talks to his students

Written by Jonathan Taplin

  • Director, USC Annenberg Innovation Lab. Producer, “Mean Streets”, “The Last Waltz”, “Until the End of the World”, “To Die For”

So we are about to embark on a sixteen-week exploration of innovation, entertainment, and the arts. This course is going to be about all three, but I’m going to start with the “art” part — because without the art, no amount of technological innovation or entertainment marketing savvy is going to get you to go to the movie theater. However, I think there’s also a deeper, more controversial claim to be made along these same lines: Without the art, none of the innovation matters — and indeed, it may be impossible — because the art is what gives us vision, and what grounds us to the human element in all of this. Although there will be lectures, during which I’ll do my best to share what I’ve learned about the way innovation, entertainment, and the arts fit together, the most crucial part of the class is the dialogue between us, and specifically the insights coming from you as you teach me about your culture and your ideals. The bottom line is that the world has come a long way, but from my perspective, we’re also living in uniquely worrisome times; my generation had dreams of how to make a better life that have remained woefully unfulfilled (leaving many of us cynical and disillusioned), but at the same time your generation has been saddled with the wreckage of our attempts and is now facing what may seem to be insurmountable odds. I’m writing this letter in the hopes that it will help set the stage for a truly cross-generational dialogue over the next sixteen weeks, in which I help you understand the contexts and choices that have brought us where we are today, and in which you help me, and one another, figure out the best way to move forward from here.

When I was your age, I had my heart broken and my idealism challenged multiple times by the assassinations of my political heroes: namely, John and Bobby Kennedy and Martin Luther King. Many in my generation turned away from politics and found our solace in works of art and entertainment. So one of the things I want to teach you about is a time from 1965–1980 when the artists really ruled both the music and the film industries. Some said “the lunatics had taken over the asylum” (and, amusingly enough, David Geffen named his record company Asylum), but if you look at the quality of work that was produced, it was extraordinary; in fact, most of it is still watched and listened to today. Moreover, in that period the most artistic work also sold the best: The Beatles’ Sgt. Pepper was without doubt the best record of the year but also the best selling, and The Godfather was similarly both best movie of the year and the biggest box office hit. That’s not happening right now, and I want to try to understand why that is. I want to explore, with you, what the implications of this shift might be, and whether this represents a problem. It may be that those fifteen years your parents and I were lucky enough to experience were one of those renaissance moments that only come along once every century, so perhaps it’s asking too much to expect that I’ll see it occur again in my lifetime. Nevertheless, I do hope it happens at least once in yours.

I spoke of the heartbreak of political murder that has permanently marked me and my peers, but we have also been profoundly disappointed by politics’ failure to improve the lives of the average citizen. In 1969, the median salary for a male worker was $35,567 (in 2012 dollars). Today, it is $33,904. So for 44 years, while wages for the top 10% have continued to climb, most Americans have been caught in a “Great Stagnation,” bringing into question the whole purpose of the American capitalist economy (and, along the way, shattering our faith in the “American Dream”). The Reagan-era notion that what benefited the 1% — “the establishment” — would benefit everyone has by now been thoroughly discredited, yet it seems that we are still struggling to pick up the pieces after this failed experiment.

Seen through this lens, the savage partisanship of the current moment makes an odd kind of sense. What were the establishment priorities that moved inexorably forward in both Republican and Democratic administrations? The first was a robust and aggressive foreign policy. As Stephen Kinzer wrote about those in power during the 1950s, “Exceptionalism — the view that the United States has a right to impose its will because it knows more, sees farther, and lives on a higher moral plane than other nations — was to them not a platitude, but the organizing principle of daily life and global politics.”

From Eisenhower to Obama, this principle has been the guiding light of our foreign policy, bringing with it annual defense expenditures that dwarf those of all the world’s major powers combined. The second principle of the establishment was that “what is good for Wall Street is good for America.” Despite Democrats’ efforts to paint the GOP as the party of Wall Street, one would only have to look at the track record of Clinton’s treasury secretaries Rubin and Summers (specifically, their zealous efforts to kill the Glass-Steagall Act and deregulate the big banks and the commodities markets) to see that both major parties are guilty of sucking up to money; apparently, the establishment rules no matter who is in power. Was it any surprise, then, that Obama appointed the architects of bank deregulation, Summers and Geithner, to clean up the mess their policies had caused? Was it any surprise that they failed? Was it any surprise that establishment ideas about the surveillance state were not challenged by Obama? The good news is that, as a nation, we have grown tired of being the world’s unpaid cop, and we are tired of dancing to Wall Street’s tune. Slowly, we are learning that these policies may benefit the 1%, but they don’t benefit the people as a whole. My guess is the 2016 election may be fought on this ground, and we may finally begin to see real change, but the fact remains that we — both your generation and mine — are right now deeply mired in the fallout of unfulfilled promises and the failures of the political system.

So this is the source of boomer disillusionment. But even if we are cynical about political change, we can try to imagine together a future where great artistic work continues to flourish; this, then, is the Innovation and Entertainment part of the course. It’s not that I want you to give up on politics — in fact the events of the last few weeks in Ferguson only reinforce my belief that when people disdain politics, their anger gets channeled into violence. But what I do want you to think about is that art and culture are more plastic — they can be molded and changed more easily than politics. There is a sense in which art, politics, and economics are all inextricably and symbiotically tied together, but history has proven to us that art serves as a powerful corrective against the dangers of the establishment. There is a system of checks and balances in which, even though the arts may rely on the social structures afforded by strong economic and political systems, artists can also inspire a culture to move forward, to reject the evils of greed and prejudice, and to reconnect to its human roots. If we are seeking a political and economic change, then, an authentic embrace of the arts may be key. Part of your role as communication scholars is to look more closely at the communication surrounding us and think critically about the effects it’s having, whose agenda is being promoted, and whether that’s the agenda that will serve us best. One of the tasks we’ll wrestle with in this class will be how we can get the digital fire hose of social media to really support artists, not just brands.

In 2011, the screenwriter Charlie Kaufman (Being John Malkovich, Adaptation) gave a lecture at the British Film Institute. He said something both simple and profound:

People all over the world spend countless hours of their lives every week being fed entertainment in the form of movies, TV shows, newspapers, YouTube videos and the Internet. And it’s ludicrous to believe that this stuff doesn’t alter our brains.

It’s also equally ludicrous to believe that — at the very least — this mass distraction and manipulation is not convenient for the people who are in charge. People are starving. They may not know it because they’re being fed mass produced garbage. The packaging is colorful and loud, but it’s produced in the same factories that make Pop Tarts and iPads, by people sitting around thinking, “What can we do to get people to buy more of these?”

And they’re very good at their jobs. But that’s what it is you’re getting, because that’s what they’re making. They’re selling you something. And the world is built on this now. Politics and government are built on this, corporations are built on this. Interpersonal relationships are built on this. And we’re starving, all of us, and we’re killing each other, and we’re hating each other, and we’re calling each other liars and evil because it’s all become marketing and we want to win because we’re lonely and empty and scared and we’re led to believe winning will change all that. But there is no winning.

I think Charlie is right. People are starving, so we give them bread and circuses.

But I think Charlie is wrong when he says “there is no winning.” In fact, I think we are really in a “winner-take-all” society. Look at the digital pop charts: 80% of the music streams are for 1% of the content. That means that Jay-Z and Beyoncé are billionaires, but the average musician can barely make a living. Bob Dylan’s first album sold only 4,000 copies. In this day and age, he would have been dropped by his label before he created his greatest work.

A writer I greatly admired, Gabriel García Márquez, died recently. For me, Márquez embodied the role of the artist in society, marked by the refusal to believe that we are incapable of creating a more just world. Utopias are out of favor now. Yet Márquez never gave up believing in the transformational power of words to conjure magic and seize the imagination. The other crucial aspect of Márquez’s work is that he teaches us the importance of regionalism. In a commercial culture of sameness, where you can stroll through a mall in Shanghai and forget that you’re not in Los Angeles, Márquez’s work was distinctly Latin American. His work was as unique as the songs of Gilberto Gil, or the cinema of Alejandro González Iñárritu. In a culture like ours, which has so long advocated a “melting pot” philosophy that papers over our differences, it is valuable to recognize that there is a difference between allowing our differences to serve as barriers and appreciating the things that make each culture unique, situated in time and space and connected to its people. What’s more, young artists also need to have the sense of history that Márquez celebrated when he said, “I cannot imagine how anyone could even think of writing a novel without having at least a vague idea of the 10,000 years of literature that have gone before.” Cultural amnesia only leads to cultural death.

With these values in mind, my hope is to lead you in a discussion of politics and culture in the context of 250 years of America’s somewhat utopian battle to build “a city on a hill.” I think many in my generation had this utopian impulse (which is, it should be observed, different from idealism), but it is slipping away like a short-term memory. I did not aspire to be that professor who quotes Dr. King, but I feel I must. He said, the night before he was assassinated, “I may not get there with you, but I believe in the promised land.” My generation knew that the road towards a better society would be long, but we hoped our children’s children might live in that land, even if we weren’t able to get there with you. It may take even longer than we imagined, but I know your generation believes in justice and equality, and that fills me with hope that the dream of some sort of promised land is not wholly lost. The next step, then, is to figure out how to work together, to learn from the past while living in the present moment in order to secure a better future, and I believe this class offers us an incredible opportunity to do precisely that.

So what are the skills that we can develop together in order to open a real cross-generational dialogue? First, I would hope we would learn to improvise. I want you to challenge me, just as I encourage and challenge you. Improvisation means sometimes throwing away your notes and just responding from your gut to the ideas being presented. It takes both courage and intelligence, but I’m pretty sure you have deep stores of both qualities, which will help you show leadership both in class and throughout the rest of your life. Leadership is more than just bravery and intellect, however; it also requires vulnerability and compassion, skills that I hope we can similarly cultivate together. I want you to know that I don’t have all the answers — and, more importantly, I know that I don’t have all the answers. I am somewhat confused by our current culture and I am looking to you for insight. You need to have that same vulnerability with your peers, and you also need to treat them with compassion as you struggle together to understand this new world of disruption. I know these four elements — courage, intelligence, vulnerability, and compassion — may seem like they are working at cross-purposes, but we will need all four qualities if we are to take on the two tasks before us. One of our tasks is to try to restore a sense of excellence in our culture — the belief that great art and entertainment can also be popular. The second task is for baby boomer parents and their millennial children to form a natural political alliance going forward. As I’ve said, I don’t think the notion that we will get to “the promised land” is totally dead, and with your energy and the tools of the new media ecosystem to help us organize, we can keep working towards a newly hopeful society, culture, and economy, in spite of the mess we have left you with.

This is, at least, the plan. Of course, as the great critic James Agee once said, “Performance, in which the whole fate and terror rests, is another matter.”


https://medium.com/@jonathantaplin/letter-to-the-millennials-f90a5fb3b366

Hillary Clinton’s sobering recent interview is another example of how the Gipper’s sunny nationalism won’t go away

Reagan is still killing us: How his dangerous “American exceptionalism” haunts us today

Ronald Reagan (Credit: AP/Ira Schwarz)

As the chaos in Missouri has reminded us this past week, the gap between what the United States is supposed to be and what it actually is remains more than large enough to fit a SWAT team or two. But while the always-childish fantasy of a post-racial America is choked by tear gas and pummeled by rubber and wooden bullets, the past few days have also seen the resurgence of another distinguishing aspect of the American character: Our unshakable belief in our own superiority, and our unwavering optimism that said superiority means we can right the world’s many wrongs.

I’m thinking, of course, of Hillary Clinton’s recent interview with Jeffrey Goldberg in the Atlantic, an interview that my colleague Joan Walsh rightly described as “sobering” for any progressive who’d resigned herself to a Clinton candidacy but hoped the former secretary of state had lost the martial inclination that likely cost her the presidency in 2008. Because while it’s true that some in the media (especially those with a neoconservative worldview) exaggerated the forcefulness of Clinton’s criticisms, it’s also true that Clinton reminded us that her view of the world differs from the president’s in some fundamental ways.

“You know, when you’re down on yourself, and when you are hunkering down and pulling back, you’re not going to make any better decisions than when you were aggressively, belligerently putting yourself forward,” Clinton told the hawkish Goldberg, implicitly arguing that Obama’s relative reluctance to send U.S. troops into other countries was no better than his predecessor’s belief that no problem was too big to be solved by an American with a gun. Sounding another dog-whistle for the unreconstructed neo-imperialists among us, she went on to complain that “we don’t even tell our own story very well these days,” chalking up America’s diminished global reputation not to its policies but rather its shoddy branding. (This is a move conservative Republicans pull after every election loss, which should tell you something of its intellectual merit.)

The key moment, however, was what came next, after Clinton’s use of the corporate “tell our story” cliché, when Goldberg said that “defeating fascism and communism is a pretty big deal.” “That’s how I feel!” was Clinton’s enthusiastic response, before she added, with faux modesty, that a belief in the U.S. as global savior “might be an old-fashioned idea,” but she’d keep it all the same. It was a striking exchange not just for its historical ignorance (when it comes to defeating both Nazism and Bolshevism, it’s the people of the Soviet Union and its satellites, not Americans, who deserve the most credit) but also for its schmaltziness and the way it put a folksy, heartland spin on a historical narrative that inexorably leads to militarism.

For all the ways he’s disappointed when it comes to ending the neo-imperial era of American foreign policy, President Obama doesn’t talk like this. He doesn’t go for the kind of rhetoric that places the U.S. as the protagonist and hero in a geopolitical drama of good vs. evil. And unlike Clinton, he doesn’t talk about groups of Islamic extremists as if they’re simply the latest versions of a cosmic evil that once took the forms of Nazi Germany and the USSR. He’s a nationalist, sure; and he certainly shares Clinton’s preference for a global order in which America pretends to be first among equals, when the real balance of power is anything but. Still, Obama, unlike Clinton, doesn’t talk about the world as if it were the stage for a great struggle between slavery and freedom. He knows that kind of talk was discredited by the results of our foreign policy from 2002 to 2008.

Weirdly, Clinton’s decision to speak about the U.S.’s role in global politics as if she, in contrast to Obama, was an unapologetic, “old-fashioned” believer in American exceptionalism made her sound like no one so much as Ronald Reagan, the last president who told a humbled America to buck up and forget its recent mistakes. Indeed, as MSNBC’s Chris Hayes noted this week when he hosted historian Rick Perlstein, whose new book on Reagan depicts him as absolver-in-chief, Clinton is seemingly “channeling Reagan in a very similar political moment” as that which confronted “the Gipper” after the trauma of the chaotic ’60s, Richard Nixon and Watergate. Pushing back against what Perlstein described as Obama’s attempt to inject “nuance” and “complexity” into our foreign policy debates, Clinton instead wants us to reclaim the “old-fashioned” belief that conquering “evil” is the special job of the exceptional USA.

It’s possible that this is all so much pre-campaign positioning on Clinton’s part, but I think she means it. After all, for someone planning to enter the Democratic Party presidential primary, being described as sounding like Ronald Reagan is, well, not great. But in fairness to Clinton, if you take a step back and listen to how most postwar presidents have spoken, she’s not breaking from the norm or doing something new. On the contrary, she’s signaling her intention to return to a former status quo — it just happens to be one that has served most Americans poorly of late. Since the era of Reagan, and for much of the century before, most national-level politicians have exploited Americans’ characteristic optimism and belief in their country’s virtue to push a foreign policy that supposedly spreads freedom but really helps capital by meddling in the lives of poor people who live far, far away.

So here’s a prediction about Hillary Clinton and the 2016 presidential race. At one point or another, there will be a television ad in which Hillary Clinton will speak of bringing back the former glory of the United States. She’ll say it’s time to mark an end to nearly 20 years of terrorism, depression, war and defeat. It’s time to feel good again about being the leader of the free world. It’s morning in America; and everything is great.

 

Elias Isquith is an assistant editor at Salon, focusing on politics. Follow him on Twitter at @eliasisquith, and email him at eisquith@salon.com.

 

http://www.salon.com/2014/08/16/reagan_is_still_killing_us_how_his_dangerous_american_exceptionalism_haunts_us_today/?source=newsletter