HAPPY THANKSGIVING!!!


Photo “Meditation” by DJ Apollo.  All rights reserved.

 

Happy Thanksgiving to all of my readers!!!

I’m deeply thankful this year for the love of my spouse, Joey. We will be celebrating 36 years together next month. And for my parrot, Algeron: 25 years together. I hope all of you are having an amazing Thanksgiving day. Joey and I are watching the Macy’s Thanksgiving Day Parade from New York: a tradition.

It’s important to remember, however, that for many, myself included, the beginning of the “holiday season” is not all fun and merriment. The holidays, as the Blue Buffalo commercial that just came on announced, are all about family. Most of my blood family have passed on: mother, father, grandparents, aunts and uncles, even my only sibling, my younger brother, and my only child, my beloved daughter Amoret. One cannot help but reflect on these losses and conjure up memories of those who are gone. On the bright side, I am happy that I made the acquaintance of my nephew, David, in Florida this year and renewed my friendship with my cousin Lorraine, also in Florida. My grandchildren and nieces and nephews are scattered about the country. Modern life. When I was growing up, family was a short drive away.

The worst part of the “holiday season” for me is the frantic commercialism. I refuse to participate in it, but it’s hard to avoid the media blitz. Since we mostly watch movies on the TV these days, we are not subject to the horrid commercials. The mailbox, however, fills every day with people asking for money, and AdBlock Plus can’t filter out all of the holiday money-focused “cheer” on the net. I genuinely hate all of this and wish I could move somewhere with no “holiday season” until it all passes.

Finally, this time of year has become a period of meditation for me on the past, present and future. My situation is such that every day is precious. I’ve been dancing with death for some time now, and I know that my time is limited. Joey’s medical crisis last Sunday is another terrifying reminder of the limitations of human existence. In this state of mind the trivial becomes unimportant; things lose meaning and value; only love and giving of oneself are primary.

Chronicle of a Riot Foretold

FERGUSON, Missouri—For a hundred and eight days, through the suffocating heat that turned the city into a kiln, through summer thunderstorms and the onset of an early winter, through bureaucratic callousness and the barbs of cynics who held that the effort was of no use and the prickly fear that they might be right, a community in Ferguson, Missouri, held vigil nightly, driven by the need to validate a simple principle: black lives matter. On November 24, 2014, we learned that they do indeed matter, just less than others—less than the prerogatives of those who wield power here, less than even the cynics may have suspected.

Last night, the streets of Ferguson were congested with smoke and anger and disillusionment and disbelief, and also with batons and the malevolent percussion of gunfire and the hundreds of uniformed men brought here to marshal and display force. Just after eight on Monday evening, after a rambling dissertation from the St. Louis County Prosecutor, Robert McCulloch, that placed blame for tensions on social media and the twenty-four-hour news cycle, and ended with the announcement that the police officer Darren Wilson would not be indicted for shooting Michael Brown six times, the crowd that gathered in front of the police headquarters, on South Florissant Road, began to swell. Their mood was sombre at first, but some other sentiment came to the fore, and their restraint came unmoored. A handful of men began chanting “Fuck the police!” in front of the line of officers in riot gear that had gathered in front of the headquarters. Gunshots, the first I heard that night, cut through the air, and a hundred people began drifting in the direction of the bullets. One man ripped down a small camera mounted on a telephone pole. A quarter mile away, the crowd encountered an empty police car and within moments it was aflame. A line of police officers in military fatigues and gas masks turned a corner and began moving north toward the police building. There were four hundred protesters and nearly that many police officers filling an American street, one side demanding justice, one side demanding order, both recognizing that neither of those things was in the offing that night.

What transpired in Ferguson last night was entirely predictable, widely anticipated, and, yet, seemingly inevitable. Late last week, Michael Brown, Sr., released a video pleading for calm, his forlorn eyes conveying exhaustion born of not only shouldering grief but also of insisting on civic calm in the wake of his son’s death. One of the Brown family’s attorneys, Anthony Gray, held a press conference making the same request, and announced that a team of citizen peacekeepers would be present at any subsequent protests. Ninety minutes later, the St. Louis mayor, Francis Slay, held a press conference in which he pledged that the police would show restraint in the event of protests following the grand-jury decision. He promised that tear gas and armored vehicles would not be deployed to manage protests. The two conferences bore a disturbing symmetry, an inversion of pre-fight hype in which each side deprecated possible violence but expressed skepticism that the other side was capable of doing the same. It’s possible that, recognizing that violence was all but certain, both sides were seeking to deflect the charge that they had encouraged it. Others offered no such pretense. Days ahead of the announcement, local businesses began boarding up their doors and windows like a coastal town anticipating a hurricane. Missouri Governor Jay Nixon declared a preëmptive state of emergency a week before the grand jury concluded its work. His announcement was roughly akin to declaring it daytime at 3 A.M. because the sun will rise eventually.

From the outset, the great difficulty has been discerning whether the authorities are driven by malevolence or incompetence. The Ferguson police let Brown’s body lie in the street for four and a half hours, an act that either reflected callous disregard for him as a human being or an inability to manage the situation. The release of Darren Wilson’s name was paired with the release of a video purportedly showing Brown stealing a box of cigarillos from a convenience store, although Ferguson police chief Tom Jackson later admitted that Wilson was unaware of the incident when he confronted the young man. (McCulloch contradicted this in his statement on the non-indictment.) Last night, McCulloch made the inscrutable choice to announce the grand jury’s decision after darkness had fallen and the crowds had amassed in the streets, factors that many felt could only increase the risk of violence. Despite the sizable police presence, few officers were positioned on the stretch of West Florissant Avenue where Brown was killed. The result was that damage to the area around the police station was sporadic and short-lived, but Brown’s neighborhood burned. This was either bad strategy or further confirmation of the unimportance of that community in the eyes of Ferguson’s authorities.

The pleas of Michael Brown’s father and Brown’s mother, Lesley McSpadden, were ultimately incapable of containing the violence that erupted last night, because in so many ways what happened here extended beyond their son. His death was a punctuation to a long, profane sentence, one which has insulted a great many, and with damning frequency of late. In his statement after the decision was announced, President Barack Obama took pains to point out that “there is never an excuse for violence.” The man who once told us that there was no black America or white America but only the United States of America has become a President whose statements on unpunished racial injustices are a genre unto themselves. Perhaps it only seems contradictory that the deaths of Oscar Grant and Trayvon Martin, Ezell Ford and John Crawford and Michael Brown—all unarmed black men shot by men who faced no official sanction for their actions—came during the first black Presidency.* Or perhaps the message here is that American democracy has reached the limits of its elasticity—that the symbolic empowerment of individuals, while the great many remain citizen-outsiders, is the best that we can hope for. The air last night, thick with smoke and gunfire, suggested something damning of the President.

*Correction: An earlier version of this post conflated the names of Ezell Ford and John Crawford.

You should actually blame America for everything you hate about internet culture

November 21

The tastes of American Internet-users are both well-known and much-derided: Cat videos. Personality quizzes. Lists of things that only people from your generation/alma mater/exact geographic area “understand.”

But in France, it turns out, even viral-content fiends are a bit more … sophistiqués.

“In France, articles about cats do not work,” Buzzfeed’s Scott Lamb told Le Figaro, a leading Parisian paper. Instead, he explained, Buzzfeed’s first year in the country has shown it that “the French love sharing news and politics on social networks – in short, pretty serious stuff.”

This is interesting for two reasons: first, as conclusive proof that the French are irredeemable snobs; second, as a crack in the glossy, understudied facade of what we commonly call “Internet culture.”

When the New York Times’s David Pogue tried to define the term in 2009, he ended up with a series of memes: the “Star Wars” kid, the dancing baby, rickrolling, the exploding whale. Likewise, if you look to anyone who claims to cover the Internet-culture space — not only Buzzfeed, but Mashable, Gawker and, yeah, yours truly — their coverage frequently plays on what Lamb calls the “cute and positive” theme: boys who work at Target and have swoopy hair, videos of babies acting like “tiny drunk adults,” hamsters eating burritos and birthday cakes.

That is the meaning we’ve assigned to “Internet culture,” itself an ambiguous term: It’s the fluff and the froth of the global Web.

But Lamb’s observations on Buzzfeed’s international growth would actually seem to suggest something different. Cat memes and other frivolities aren’t the work of an Internet culture. They’re the work of an American one.

American audiences love animals and “light content,” Lamb said, but readers in other countries have reacted differently. Germans were skeptical of the site’s feel-good frivolity, he said, and some Australians were outright “hostile.” Meanwhile, in France — land of la mode and le Michelin — critics immediately complained, right at Buzzfeed’s French launch, that the articles were too fluffy and poorly translated. Instead, Buzzfeed quickly found that readers were more likely to share articles about news, politics and regional identity, particularly in relation to the loved/hated Paris, than they were to share the site’s other fare.

A glance at Buzzfeed’s French page would appear to bear that out. Right now, its top stories “Ça fait le buzz” — that’s making the buzz, for you Américains — are “21 photos that will make you laugh every time” and “26 images that will make you rethink your whole life.” They’re not making much buzz, though. Neither has earned more than 40,000 clicks — a pittance for the reigning king of virality, particularly in comparison to Buzzfeed’s versions on the English site.

All this goes to show that the things we term “Internet culture” are not necessarily born of the Internet itself — the Internet is everywhere, but the insatiable thirst for cat videos is not. If you want to complain about dumb memes or clickbait or other apparent instances of socially sanctioned vapidity, blame America: We started it, not the Internet.

Appelons un chat un chat. (Let’s call a cat a cat.)

Caitlin Dewey runs The Intersect blog, writing about digital and Internet culture. Before joining the Post, she was an associate online editor at Kiplinger’s Personal Finance.
http://www.washingtonpost.com/news/the-intersect/wp/2014/11/21/you-should-actually-blame-america-for-everything-you-hate-about-internet-culture/

Every sci-fi movie since Kubrick’s 1968 masterpiece has echoed the original in certain unavoidable ways

Kubrick’s indestructible influence: “Interstellar’’ joins the long tradition of borrowing from “2001’’

“2001: A Space Odyssey” and “Interstellar” (Credit: Warner Bros./Salon)

When I first heard about Christopher Nolan’s new sci-fi adventure, “Interstellar,” my immediate thought was only this: Here comes the latest filmmaker to take on Stanley Kubrick’s “2001: A Space Odyssey.” Though it was released more than 40 years ago, “2001” remains the benchmark for the “serious” science fiction film: technical excellence married to thematic ambition, and a pervading sense of historic self-importance.

More specifically, I imagined that Nolan would join a long line of challengers to aim squarely at “2001’s” famous Star Gate sequence, where astronaut Dave Bowman (Keir Dullea) passes through a dazzling space-time light show and winds up at a waystation en route to his transformation from human being into the quasi-divine Star Child.

The Star Gate scene was developed by special-effects pioneer Douglas Trumbull, who modernized an old technique known as slit-scan photography. While we’ve long since warp-drived our way beyond the sequence effects-wise (you can now do slit scan on your phone), the Star Gate’s eerie and propulsive quality is still powerful, because it functions as much more than just eye candy. It’s a set piece whose theme is the attempt to transcend set pieces — and character, and narrative and, most of all, the technical limitations of cinema itself.

In “2001,” the Star Gate scene is followed by another scene that also turns up frequently in sci-fi flicks. Bowman arrives at a series of strange rooms, designed in the style of Louis XVI (as interpreted by an alien intelligence), and he watches himself age and die before being reborn. Where is he? Another galaxy? Another dimension? Heaven? Hell? What are the mysterious monoliths that have brought him here? Why?

Let’s call this the Odd Room Scene. Pristine and uncanny, the odd room is the place at the end of the journey where truths of all sorts, profound and pretentious, clear and obscure, are at last revealed. In “The Matrix Reloaded,” for instance, Neo’s Odd Room Scene is his meeting with an insufferable talking suit called the Architect, where he learns the truth about the Matrix. Last summer’s “Snowpiercer,” about a train perpetually carrying the sole survivors of a new Ice Age around the world, follows the lower-class occupants of the tail car as they stage a revolution, fighting and hacking their way through first class toward the train’s engine, an Odd Room where our hero learns the truth about the train.



These final scenes in “2001” still linger in the collective creative consciousness as inspiration or as crucible. The Star Gate and the Odd Room, particular manifestations of the journey and the revelation, have become two key architectural building blocks of modern sci-fi films. The lure to imitate and try to top these scenes, either separately or together, is apparently too powerful to resist.

Perhaps the most literal of the Star Gate-Odd Room imitators is Robert Zemeckis’s 1997 “Contact.” It’s a straightforward drama about humanity’s efforts to build a large wormhole machine whose plans have been sent by aliens, and the debate over which human should be the first to journey beyond the solar system. The prize falls to Jodie Foster’s agnostic astronomer Ellie Arroway. During the film’s Star Gate sequence, Foster rides a capsule through a wormhole that winds her around distant planets and through a newly forming star. Zemeckis’s knockoff is a decent roller coaster, but nothing more. Arroway is anxious as she goes through the wormhole, but still in control of herself; a deeply distressed Bowman, by contrast, is losing his mind.

Arroway’s wormhole deposits her in an Odd Room that looks to her (and us) like a beach lit by sunlight and moonlight. She is visited by a projection of her dead father, the aliens’ way of appearing to her in a comfortable guise, and she learns the stunning truth about … well, actually, she doesn’t learn much. Her father gives her a Paternal Alien Pep Talk. Yes, there is a lot of life out in the galaxy. No, you can’t hang out with us. No, we’re not going to answer any of your real questions. Just keep working hard down there on planet Earth; you’ll get up here eventually (as long as you all don’t kill each other first).

Brian De Palma tried his own version of the Odd Room at the end of 2000’s “Mission to Mars,” which culminates in a team of astronauts entering a cool, Kubrick-like room in an alien spaceship on Mars and, yes, learning the stunning truth about the origins of life on Earth. De Palma is a skilled practitioner of the mainstream Hollywood set piece, but you can feel the film working up quite a sweat trying and failing to answer “2001,” and the early-century digital effects depicting red Martians are, to be charitable, somewhat dated.

But here comes “Interstellar.” This film would appear to be the best shot we’ve had in years to challenge the supremacy of the Star Gate, of “2001” itself, as a Serious Sci-Fi Film About Serious Ideas. Christopher Nolan should be the perfect candidate to out-Star Gate the Star Gate. Kubrick machined his visuals to impossibly tight tolerances. Nolan (along with his screenwriter brother, Jonathan) does much the same to his films’ narratives, manufacturing elaborately conceived contraptions. The film follows a Hail Mary pass to find a planet suitable for the human race as the last crops on Earth begin to die out. Matthew McConaughey plays Cooper, an astronaut tasked with piloting a starship through a wormhole, into another galaxy and onto a potentially habitable planet. “Interstellar” promises a straight-ahead technological realism as well as a sense of conscious “We’re pushing the envelope” ambition. (Hey, even Neil deGrasse Tyson vouches for the film’s science bona fides.) The possibilities and ambiguities of time, one of Nolan’s consistent concerns as a storyteller, are meant, I think, to be the trump card that takes “Interstellar” past “2001.”

But the film is not about fealty to, or the realistic depiction of, relativity theory. It’s about “2001.” And before it can try to usurp the throne, “Interstellar” must first kiss the ring. (And if you haven’t seen “Interstellar” yet, you might want to stop reading now.) So we get the seemingly rational crew member who proves to be homicidal. The dangerous attempt to manually enter a spaceship. More brazenly, there’s a set piece of one ship docking with another. In “2001,” the stately docking of a spaceship with a wheel-shaped space station, turning gently above the Earth to the strains of the “Blue Danube,” was, quite literally, a waltz, a graceful celestial courtship. It clued us in early that the machines in “2001” would prove more lively, more human, than the humans. “Interstellar” assays the same moment, only on steroids. It turns that waltz, so rich in subtext, into a violent, vertiginous fandango as a shuttle tries to dock with a mothership that’s pirouetting out of control.

Finally, after a teasing jaunt through a wormhole earlier in the movie, we come to “Interstellar’s” Star Gate moment, as Cooper plummets into a black hole and ultimately into a library-like Odd Room that M.C. Escher might have fancied. It’s visually impressive for a moment, but its imprint quickly fades.

It’s too bad. “Interstellar” wants the stern grandeur of “2001” and the soft-hearted empathy of Steven Spielberg, but in most respects achieves neither. Visually, only a few images impress themselves on your brain — Nolan, as is often the case in his movies, is more successful at designing and calibrating his story than at creating visuals worthy of his ambition. Yet the film doesn’t manage the emotional dynamics, either. It’s not for lack of trying. The Nolan brothers are rigorous scenarists, and the concept of dual father-daughter bonds being tested and reaffirmed across space-time is strong enough on the drawing board. (Presumably, familial love is sturdier than romantic love, though the film makes a half-hearted stab at the latter.)

For those with a less sentimental bent, the thematic insistence on the primacy of love might seem hokey, but it’s one way the film tries to advance beyond the chilly humanism of Kubrick toward something more warm-blooded. Besides, when measured against the stupefying vastness of the universe, what other human enterprise besides love really matters? The scale of the universe and its utter silence is almost beyond human concern, anyway.

So I don’t fault a film that suggests that it’s love more than space-age alloys and algorithms that can overcome the bounds of space and time. But the big ideas Nolan is playing with are undercut by too much exposition about what they mean. The final scene between Cooper and his elderly daughter — the triumphant, life-affirming emotional home run — is played all wrong, curt and businesslike. It’s a moment Spielberg would have handled with more aplomb; he would have had us teary-eyed, for sure, even those who might feel angry at having their heartstrings yanked so hard. This is more like having a filmmaker give a lecture on how to pull at the heartstrings without actually doing it.

Look, pulling off these Star Gate-like scenes requires an almost impossible balance. The built-in expectations in the structure of the story itself are unwieldy enough, without the association to one of science fiction’s most enduring scenes. You can make the transcendent completely abstract, like poetry, a string of visual and aural sensations, and hope viewers are in the right space to have their minds blown, but you run the risk of copping out with deliberate obfuscation. (We can level this charge at the Star Gate sequence itself.)

But it’s easy to press too far the other way — to personify the higher power or the larger force at the end of these journeys with a too literal explanation that leaves us underwhelmed. I suppose what we yearn for is just a tiny revelation, one that honors our desire for awe, preserves a larger mystery, but is not entirely inaccessible. It’s a tiny taste of the sublime. There’s an imagined pinpoint here where we would dream of transcendence as a paradox, as having God-like perception and yet still remaining human, perhaps only for a moment before crossing into something new. For viewers, though, the Star Gate scenes ultimately play on our side of that crossroads: To be human is to steal a glimpse of the transcendent, to touch it, without transcending.

While Kubrick didn’t have modern digital effects to craft his visuals with, in retrospect he had the easier time of it. It’s increasingly difficult these days to really blow an audience’s minds. We’ve seen too much. We know too much. The legitimate pleasure we can take in knowledge, in our ability to decode an ever-more-complex array of allusions and references, may not be as pleasurable or meaningful as truly seeing something beyond what we think we know.

Maybe the most successful challenger to Kubrick was Darren Aronofsky and his 2006 film “The Fountain.” The film, a meditation on mortality and immortality, plays out in three thematically linked stories: a conquistador (Hugh Jackman) searches the New World for the biblical Tree of Life; a scientist (Jackman again) tries to save his cancer-stricken wife (Rachel Weisz); and a shaven-headed, lotus-sitting traveler (Jackman once more) journeys to a distant nebula. It’s the latter that bears the unique “2001” imprint of journey and revelation: Jackman travels in a bubble containing the Tree of Life, through a milky and golden cosmicscape en route to his death and rebirth. It’s the Star Gate and the Odd Room all in one. Visually, Aronofsky eschewed computer-generated effects for a more organic approach that leans on fluid dynamics. I won’t tell you the film is a masterpiece — its Grand Unifying ending is more than a little inscrutable; again, pulling this stuff off is a real tightrope act — but the visuals are wondrous and unsettling, perhaps the closest realization since the original of what the Star Gate sequence is designed to evoke.

Having said that, though, it may be time to turn away from the Star Gate in our quest for the mind-blowing sci-fi cinematic sequence. Filmmakers have thus far tried to imagine something like it, only better, and have mostly failed. It’s harder to imagine something beyond it, something unimaginable. Maybe future films should not be quite so literal in their chasing of those transcendent moments. This might challenge a new generation of filmmakers while also allowing the Star Gate, and “2001” itself, to lie fallow for a while, so we can return to them one day with fresh eyes.

It is, after all, when we least suspect it that a story may find a way past our jaded eyes and show us a glimpse of something that really does stir a moment of profound connection. There is one achingly brief moment in “Interstellar” that accomplishes this: Nolan composes a magnificent shot of a small starship, seen from a great distance, gliding past Saturn’s awesome rings. The ship glitters in a gentle rhythm as it catches the light of the Sun. It’s a throwaway, a transitional moment between one scene and another, merely meant to establish where we are. But its very simplicity and beauty, the power of its scale, invites us for a moment to experience the scale of the unknown and to appreciate our efforts to find a place in it, or beyond it.

http://www.salon.com/2014/11/22/kubricks_indestructible_influence_interstellar_joins_the_long_tradition_of_borrowing_from_2001/?source=newsletter
