Bill Maher’s bigoted atheism

His showdown with Fareed Zakaria shows how far he’s fallen

Bill Maher (Credit: HBO/Janet Van Ham)

You know what you call someone who makes sweeping generalizations about billions of people based on the extreme actions of a few? A bigot. Bill Maher, for example, is a bigot. And if you’re a fan of his smug, dismissive shtick, you’re a bigot too.

On Friday’s “Real Time,” Maher, who has been openly atheist his whole career but has been increasingly vocal against organized religion in recent years, squared off against Fareed Zakaria, who gave a powerful rebuttal to Maher’s reiteration of the “Islam is the motherlode of bad ideas” assertion. “My problem with the way you approach it,” Zakaria said, “is I don’t think you’re going to reform a religion by telling 1.6 billion people — most of whom are just devout people who get some inspiration from that religion and go about their daily lives — I don’t think you’re going to change religion by saying your religion is the motherlode of bad ideas, it’s a terrible thing. Frankly, you’re going to make a lot of news for yourself and you’re going to get a lot of applause lines and joke lines.” Instead, he urged, “Push for reform with some sense of respect for the spiritual values.” And on behalf of Muslims, Christians, Jews and anybody else who prays to somebody sometimes, let me just say, thank you.

As the threats of terrorism and right-wing Christianity have risen in the past few years, Maher’s aggressive brand of atheism — also popularized by the likes of Richard Dawkins and Sam Harris — has gained a strong following among a certain type of self-professed intellectual. Maher has famously said, “Religion is dangerous because it allows human beings who don’t have all the answers to think that they do” — which is pretty funny, given the know-it-all arrogance of the anti-religion big leaguers like Maher himself. As Zakaria very eloquently pointed out, that stance has given Maher more power and reach than he’s ever had in his long career. But whatever you believe or don’t, if you’re selling blanket intolerance, you don’t get to call yourself one of the good guys. You shouldn’t even get to call yourself one of the smart ones.

I’m a Christian, which in my urban, media-centric world is basically equivalent to self-identifying as a hillbilly. It also means that I have to accept that I apply the same word to myself that a lot of hateful morons do. But on Sunday at my little neighborhood church, our priest delivered a sermon in which he said, “I can’t understand how in places like Indiana, people are using Christianity as an excuse to close their doors, when we should be welcoming to everyone.” Guess what? That’s faith too. I am also keenly aware that in other parts of the world, people are being murdered for a faith that I am privileged to practice openly and without fear. And anyone, anywhere, who is openly hateful to others for their religion is part of a culture that permits that kind of persecution to endure.

Here’s what I would like Bill Maher and his smug, self-righteous acolytes to understand. There are literally billions of individuals in this world who are not murderous, ignorant, superstitious hatemongers, and who also happen to practice a religion. Billions of people who I swear — to God — have no investment in forcing their beliefs on Bill Maher. Right here in the U.S., there are millions of my fellow Christians who are strongly committed to the ideals of the Constitution, and who don’t want to live in a theocracy any more than he does.

I recently had a conversation with an atheist friend who asked why, knowing all I do of the wrongs committed by the Catholic Church, disagreeing as strongly as I do with many of its positions on women’s rights, LGBT equality and reproductive justice, I continue to stay within it. And my reply was that this is where I feel I can do the most good. I am not a disinterested party. I’m a citizen of my church and I’m going to continue to demand better of it. I don’t, however, want to sell it to anybody else. You don’t have to believe in God — or however else you may define the concept of something else out there. I don’t have all the answers to life, the universe and everything; I’m just trying to get through this plane of existence in a manner that’s philosophically satisfying and guides me in the direction of not being a selfish jerk. That’s it. All I ask — all that many, many, many of us who practice their respective religions ask — is that you conduct yourself with respect and compassion and a spirit of coexistence, and we’ll do the same. I ask that you not make assumptions about the vast majority of the world’s population based on your own need to feel good about yourself and how smart you are. Like Zakaria says, you’re not going to bring about reform that way. And as Maher and his ilk prove, you don’t need a religion to be in the business of spreading hate.

Mary Elizabeth Williams is a staff writer for Salon and the author of “Gimme Shelter: My Three Years Searching for the American Dream.” Follow her on Twitter: @embeedub.


http://www.salon.com/2015/04/13/bill_mahers_atheism_is_just_bigotry/?source=newsletter

Nimoy transformed the classic intellectual, self-questioning archetype into a dashing “Star Trek” action hero

How Leonard Nimoy made Spock an American Jewish icon
Leonard Nimoy as Spock on “Star Trek” (Credit: CBS)

I suspect I can speak for most American Jews when I say: Before I’d watched even a single episode of “Star Trek,” I knew about Leonard Nimoy.

Although there are plenty of Jews who have achieved fame and esteem in American culture, only a handful have their Jewishness explicitly intertwined with their larger cultural image. Much of the difference has to do with how frequently the celebrity in question alludes to his or her heritage within their body of work. This explains why, for instance, a comedian like Adam Sandler is widely identified as Jewish while Andrew Dice Clay is not, or how pitcher Sandy Koufax became famous as a “Jewish athlete” after skipping Game 1 of the 1965 World Series to observe Yom Kippur, while wide receiver Julian Edelman’s Hebraic heritage has remained more obscure.

With this context in mind, it becomes much easier to understand how Nimoy became an iconic figure in the American Jewish community. Take Nimoy’s explanation of the origin of the famous Vulcan salute, courtesy of a 2000 interview with the Baltimore Sun: “In the [Jewish] blessing, the Kohanim (a high priest of a Hebrew tribe) makes the gesture with both hands, and it struck me as a very magical and mystical moment. I taught myself how to do it without even knowing what it meant, and later I inserted it into ‘Star Trek.’”

Nimoy’s public celebration of his own Jewishness extends far beyond this literal gesture. He has openly discussed experiencing anti-Semitism in early-20th century Boston, speaking Yiddish to his Ukrainian grandparents, and pursuing an acting career in large part due to his Jewish heritage. “I became an actor, I’m convinced, because I found a home in a play about a Jewish family just like mine,” Nimoy told Abigail Pogrebin in “Stars of David: Prominent Jews Talk About Being Jewish.” “Clifford Odets’s ‘Awake and Sing.’ I was seventeen years old, cast in this local production, with some pretty good amateur and semiprofessional actors, playing this teenage kid in this Jewish family that was so much like mine it was amazing.”



Significantly, Nimoy did not disregard his Jewishness after becoming a star. Even after his depiction of Mr. Spock became famous throughout the world, Nimoy continued to actively participate in Jewish causes, from fighting to preserve the Yiddish language and narrating a documentary about Hasidic Jews to publishing a Kabbalah-inspired book of photography, The Shekhina Project, which explored “the feminine essence of God.” He even called for peace in Israel by drawing on the mythology from “Star Trek,” recalling an episode in which “two men, half black, half white, are the last survivors of their peoples who have been at war with each other for thousands of years, yet the Enterprise crew could find no differences separating these two raging men.” The message, he wisely intuited, was that “assigning blame over all other priorities is self-defeating. Myth can be a snare. The two sides need our help to evade the snare and search for a way to compromise.”

As we pay our respects to Nimoy’s life and legacy, his status as an American Jewish icon is important in two ways. The first, and by far most pressing, is socio-political: As anti-Semitism continues to rise in American colleges and throughout the world at large, it is important to acknowledge beloved cultural figures who not only came from a Jewish background, but who allowed their heritage to influence their work and continued to participate in Jewish causes throughout their lives. When you consider the frequency with which American Jews will either downplay their Jewishness (e.g., Andy Samberg) or primarily use it as grounds for cracking jokes at the expense of Jews (e.g., Matt Stone of “South Park”), Nimoy’s legacy as an outspokenly pro-Jewish Jew is particularly meaningful right now.

In addition to this, however, there is the simple fact that Nimoy presented American Jews with an archetype that was at once fresh and traditional. The trope of the intellectual, self-questioning Jew has been around for as long as there have been Chosen People, and yet Nimoy managed to transmogrify that character into something exotic and adventurous. Nimoy’s Mr. Spock was a creature driven by logic and a thirst for knowledge, yes, but he was also an action hero and idealist when circumstances demanded it. For the countless Jews who, like me, grew up as nerds and social outcasts, it was always inspiring to see a famous Jewish actor play a character who was at once so much like us and yet flung far enough across the universe to allow us temporary escape from our realities. This may not be the most topically relevant of Nimoy’s legacies, but my guess is that it will be his most lasting as long as there are Jewish children who yearn to learn more, whether by turning inward into their own heritage or casting their gaze upon the distant stars.

Matthew Rozsa is a Ph.D. student in history at Lehigh University as well as a political columnist. His editorials have been published in “The Morning Call,” “The Express-Times,” “The Newark Star-Ledger,” “The Baltimore Sun,” and various college newspapers and blogs. He actively encourages people to reach out to him at matt.rozsa@gmail.com


German television series Tannbach and German postwar history

By Sybille Fuchs
9 February 2015

The three-part series Tannbach—Fate of a Village broadcast by ZDF [German public-service television], which achieved high ratings in January, attempted to follow on from the great success of the trilogy Generation War (2013). The latter dealt with the impact on five young Germans of the crimes of the Nazi regime. Tannbach attempts to encompass the history of divided Germany in the years following the Second World War by dramatising the fate of the inhabitants of a small village on the border between the two Germanys.


ZDF broadcasting director Norbert Himmler asserts that the television series “tells how it all began: from our roots in the postwar Germany of both republics, the German Democratic Republic [former Stalinist East Germany, GDR] and the Federal Republic of Germany [former capitalist West Germany, FRG]”. This claim, however, is misleading. Despite some excellent performances, the series is loaded with clichés and prejudices, which it often promotes in quite an embarrassing way.

With respect to both the “West Germans’ view of East Germany” and “East Germans’ formulaic attitudes and prejudices”, screenwriters Josephin and Robert Thayenthal fail to critically examine issues in any depth. Despite the supposedly “objective picture” of the times, the view upheld in official propaganda since the demise of the GDR predominates: that in eastern Germany one kind of totalitarian dictatorship [Nazism] was replaced by another [Stalinism], which was no less brutal and cruel than the first. The screenwriters themselves speak of the “two great German dictatorships of the twentieth century”.

This equation of two completely different regimes—on the one hand, the Hitler dictatorship, which destroyed the labour movement in the interests of German business, unleashed the Second World War and murdered millions of Jews, Sinti and Roma, disabled persons and prisoners of war; and, on the other, the Stalinist dictatorship, which nationalised large estates and industries, but suppressed workers’ democracy in order to secure the rule of a privileged bureaucracy—precludes the possibility of any realistic and credible representation of the period. Tannbach tends to present viewers with stock figures rather than human characters.

The US forces, who initially occupy the village at the end of the war, are generous, benevolent and “cool”. The Soviets, who later take over from the Americans, descend on the defenceless villagers like barbarian hordes. At the end of Part 1, the first appearance of the Soviet military concludes with their shooting of an innocent old man, a mother and a child, simply because a portrait of Hitler is found in a drawer. This pattern of presentation runs through the whole film.

Trying to balance this one-sided view by including two “good” communists fails to make things better. Both of them—Konrad Werner (Ronald Zehrfeld), who has returned from exile in the USSR, and Friedrich Erler (Jonas Nay), the son of a Communist murdered by the Nazis—appear naive and implausible in their idealistic belief in a better future.

The film’s scriptwriters are so unaware of their own prejudices that they even reproduce the kind of bigotry characteristic of the Nazi era. Of the two young friends who flee to Tannbach from the rubble of Berlin, Friedrich Erler (non-Jew) becomes a farmer, while Lothar Erler (a Jew) ends up a smuggler. The writers should be ashamed of themselves.

“The Morning After the War”

Tannbach is a fictional village on the Thuringian-Bavarian border. It is based on the actual village of Mödlareuth in Upper Franconia-Thuringia, which was split between the US and Soviet occupation zones in 1945. The stream running through that small village is the Tannbach. The three-part series was actually filmed in Besno in the Czech Republic. Considerable effort was put into the attempt to make details of scenery, costumes and props as historically accurate as possible.

Part 1, “The Morning After the War”, begins in the last days of World War II. Just before US troops break into the estate of Count Georg von Striesow (Heiner Lauterbach), a young SS officer (David Zimmerschied) has the Countess (Natalia Wörner) shot because she refuses to betray her husband who has returned from the war as a deserter.

The count was denounced by Franz Schober (Alexander Held), a prosperous farmer and fanatical Nazi, who immediately offers to serve the Americans with his meticulously recorded insider knowledge of Nazi members and their activities. The SS officer, Schober’s illegitimate son, is exposed to the Americans by his own mother, Hilde (Martina Gedeck).

In any event, the US occupation is brief. Thuringia is assigned to the Soviet occupation zone, while Bavaria remains under American control. Soviet troops take over the village. Following a later revision of demarcation lines, US troops return to the western side of the village, which is divided down the middle.

The Soviet soldiers are portrayed as violent thugs, taking revenge for the atrocities of the German military through rape and plunder. What the German troops have done in the east is not revealed until the third part of the trilogy. Schober’s firstborn son, returning late from the war, shouts into the count’s face that he himself had ordered massacres before deserting his command. In retaliation for the killing of German soldiers, entire village populations—men, women and children—were shot as partisans.

One of the most powerful scenes in the first part includes the screening of a film recording the American troops’ liberation of Buchenwald concentration camp, which the Tannbach villagers are made to watch.

Further plot developments focus on the count’s daughter, Anna (Henriette Confurius), and Friedrich Erler, the working class youth from Berlin. The young couple fall in love, hoping to find fulfilment in a new and better world, where there are “no top and no bottom classes, and no more war”. Friedrich’s mother, Liesbeth Erler (Nadja Uhl), wants to escape the bad times, go to America and take her family with her.

Meanwhile the countess’s parents, former brewery owners from Zwickau and still fervent Nazis, have prepared their escape to Argentina via the “rat line” organised by Nazi operatives in collaboration with the Vatican, and want to take their granddaughter Anna with them, but she refuses.

“The Expropriation”

Part 2, “The expropriation”, deals with land reform in the Soviet occupation zone. Landowners who possess more than 100 hectares [247 acres] of property, or who were members of the Nazi party and committed war crimes, are expropriated without compensation. The land is then divided into five-hectare [12-acre] portions and allotted to the so-called “new farmers”.


The film fails to explain the brutal and reactionary role played by the Junker class (Prussian nobility) during the Wilhelmine Empire (1871-1918), the Weimar Republic (1919-1933) and the Nazis’ seizure of power in 1933. Instead, Count von Striesow (convincingly played by Lauterbach) is presented as not such a bad fellow, although he reacts to the expropriation of his estate as the worst injustice imaginable.

After returning from POW incarceration in France, he refuses to accept that his daughter Anna has married Friedrich and is working with him to cultivate five hectares of land allocated to them from the Striesow family’s former estate.

“My Land, Your Land”

In Part 3, “My Land, Your Land”, Anna and Friedrich are living on their small farm, which is barely capable of supporting them and their child. Lothar contributes significantly to the family’s subsistence through his cross-border smuggling and as a black marketeer.


Four years later, in 1952 and during the Cold War, the East German Stalinists build a fence running across the whole of Germany. It goes through the middle of Tannbach, which lies within the five-kilometre protected area behind the fence. The entire population of the eastern part of the village is subjected to stringent security regulations.

At this point, Liesbeth visits Tannbach from America, enthusiastically praising New York, where everyone can say what he or she wants and it “doesn’t matter whether anyone is a Jew or a Catholic”. She denies that anyone wants a new war: “You’ve all just talked yourselves into believing that.”

No one watching Tannbach would know that the US had just initiated a bloody war against North Korea and China that claimed three to four million lives, that the American ruling elite was engaged in the ferocious, anti-democratic McCarthyite witch-hunts and that African Americans were subject to brutal apartheid conditions in the US South.

A young East German border guard in Tannbach is shot and killed by West German border guards, leading to a tightening of border security. All people suspected of not being one hundred percent loyal to the Stalinist regime are forced to relocate away from the immediate border area or face prison if they oppose the evacuation order.


District administration head Konrad Werner, an idealistic communist, initially protects the Erlers when they are targeted for arrest by the increasingly powerful state security forces because of Lothar’s smuggling activities. Lothar is shot by East German border guards as he attempts to illegally cross the demarcation area to attend the baptism of Anna and Friedrich’s child, which Anna had requested be held at the village church in Tannbach’s western half.

After the baptism, Liesbeth remains in West Germany. The family is finally separated. District commissar Werner is removed to Berlin. He says goodbye to Friedrich with the words: “Working for a fairer world is not a bad idea, but unfortunately there’ll be setbacks along the way. New world orders take time to bring into existence.” Anna receives her downcast husband, Friedrich, with the words: “I’m proud of you. I believe in this here. I believe in everything we’ve built up here. This is our home”.

This open but scarcely credible conclusion is apparently designed to allow the filmmakers to claim they have presented the problems and perspectives of East and West Germans objectively.

Screenwriters Josephin and Robert Thayenthal respond to the many questions raised by the period treated in the film as follows: “Television won’t be able to give the answers, but perhaps it will give a sense of how people feel, think and act, how they develop and harden, how they behave under the threat of overwhelming power and in the grip of extreme fear”.

This statement points to the dilemma faced by television productions that claim simultaneously to entertain and educate. Communicating vague “feelings” is not enough for an understanding of history. Different viewpoints and experiences are juxtaposed, but they cannot be comprehended in any profound way because the socio-historical context has not been presented.

What Tannbach entirely leaves out, among other things, is the role of the parties responsible for the defeat of the German working class and the catastrophe of fascism: the Social Democrats and, above all, the Communist Party, which, due to Stalin’s catastrophic policies, facilitated Hitler’s rise to power. Without coming to terms with the failure of the social revolution in Germany due to the crisis of working class leadership, no significant chapter of mid- and late-20th century German history can be profoundly understood.

Due to certain outstanding performances, the production offers numerous impressive scenes, but ultimately Tannbach is unsatisfactory. Compared to Edgar Reitz’s epic series Heimat: A Chronicle of Germany (1984), for example, the portrayal of the characters and their involvement in various events is often quite flat and unconvincing. But this is more the fault of the weak and limited script, and not so much that of the director, Alexander Dierbach, or the performers.

The documentary Spring 1945, broadcast by Arte on January 13, is far more successful in conveying an accurate picture of the immediate postwar period. The three episodes of Tannbach and related documentary material are currently available in the ZDF media library.


http://www.wsws.org/en/articles/2015/02/09/tann-f09.html

“Better Call Saul” humanizes the smooth-talking “Breaking Bad” sidekick in a surprisingly solid spin-off

Vince Gilligan’s new antihero origin story has more in common with “Mad Men” than “Breaking Bad”

Bob Odenkirk in “Better Call Saul” (Credit: AMC/Ursula Coyote)

I’m surprised how much I liked “Better Call Saul.” We might as well start there.

“Better Call Saul” isn’t exactly supposed to be good. It’s a spin-off of a beloved television show, “Breaking Bad”; and unlike “Friends” or “Cheers,” which both spawned spinoffs, “Breaking Bad” isn’t a feel-good sitcom with a happy ending. The five seasons of the original AMC show were a slow, brutal transformation story, from Walter White the man to Heisenberg the monster, and if the drug-dealing arc didn’t interest you, the incredible direction and once-in-a-lifetime performances might.

So when AMC announced the production of “Better Call Saul,” I was skeptical—not because I thought something from Vince Gilligan, Peter Gould and actor Bob Odenkirk couldn’t be good, but because I worried that the spin-off might tarnish the original. (I wish I could forget the “Star Wars” prequels. I wish I could.) The production decision is undoubtedly an attempt to make more money off of a successful franchise with an established fanbase—a situation that can privilege hacky fan service over quality and creativity. (Think “Joey,” the spinoff from “Friends,” as opposed to “Frasier,” the spinoff from “Cheers.”)

Vince Gilligan and his team, as usual, have surprised me. I haven’t totally fallen for the prequel series “Better Call Saul”—it doesn’t quite feel like its own show yet—but it did make me care about the man who becomes Saul Goodman in a way I never did in “Breaking Bad.” And though the story of Walter White is done and dead, series creators Gilligan and Gould have found a way to tell the story of Saul—currently known as Jimmy McGill, public defender—in a way that echoes and parallels White’s story without necessarily covering the same ground. The general premise is the same: The world makes it hard to be a good man (or a Goodman). But the sordid particulars will always vary.



When we meet Jimmy McGill—six years before the events of “Breaking Bad”—what’s fascinating about him is that he seems to know this already. Not exactly for himself, although his career has already brushed the wrong side of the law. But definitely for others. Jimmy barely makes ends meet by defending criminals in county court, where he is forced to come up with a narrative of explanation and redemption for possibly guilty defendants, multiple times a day. Jimmy’s a talker—that’s what he’s good at. That’s why he’s a lawyer, that’s what he brings to the table. But he’s not just a talker, he’s a storyteller of sorts: a salesman, a charlatan, an ad man. He’s got a plausible explanation for his clients’ many missteps, a ready tale of sympathy for anyone willing to listen—the judge, the jury, the prosecutor, the woman validating his parking. And though it sounds glib, it’s not effortless—we see him rehearse in mirrors, practice in his car, work through talking points before knocking on doors. He has to work up the energy to bluster. Maybe because he just wants to build momentum, and maybe because when you’re essentially a legal con man, you have to be careful to get your words right. But there’s a hint of something more tragic, too: Jimmy has to convince himself of the truth of his words so that he can have the most impact. He’s got to believe that his clients are innocent-ish in order to fight for them; he’s got to become the lie, or to become, more specifically, the most convenient version of the truth.

It’s there, in Jimmy McGill’s fast-talking attempt to come out on top, that “Better Call Saul” really shines. Though the show is a spin-off of “Breaking Bad,” its McGill has more in common with “Mad Men’s” Don Draper—not the womanizing or the mythos, but certainly that same fanatical commitment to selling a version of reality that both men end up half-believing, just to survive.

By the time we meet him in “Breaking Bad,” Bob Odenkirk’s Saul is a static figure—he’s part of the criminal environment that Walt and Jesse break into. His answers and advice are all world-weary and polished. “Better Call Saul” offers the viewer a chance to see how he would become that man. It’s more than a little convoluted—there’s a brother, a situation with a big law firm that is only explained in bits and pieces, a scheme gone wrong and the familiar landscape of the desert-suburbia of Albuquerque, shot with the same golden filters and wide angles. At times, the familiarity is exciting; at other times, it’s jarring. And the rest of the time, it’s vaguely frustrating—we’ve explored this landscape of abandoned strip malls, remote gas stations and cheap flip-phones before. There are a few familiar faces in the first three episodes; at least one made me roll my eyes. But there’s something a little delicious about the continuity, too: Spin-offs are the type of weird pop-culture artifact unique to serialized forms, and television in particular. It’s absurd and intriguing to see a master of the form take it on.

So for right now, I’m willing to go along with “Better Call Saul’s” smooth-talking appeal. Gilligan did masterful work with “Breaking Bad,” telling a story not just about Walter White but also about the culture that shaped and enabled him. Now he’s taking on another type of criminal—a trickster, not a mastermind. Jimmy McGill is very good at what he does, and as the first few episodes with him show, at least several years ago his heart was mostly in the right place. But he started to believe his own ready supply of lies, and that was the beginning of the end. You can’t talk your way out of the truth forever.

“Better Call Saul” premieres on AMC at 10 p.m. on Sunday, Feb. 8. The second episode will air at 10 p.m. on Monday, Feb. 9. The series will air on Mondays.

The Killing of America’s Creative Class


A review of Scott Timberg’s fascinating new book, ‘Culture Crash.’

Some of my friends became artists, writers, and musicians to rebel against their practical parents. I went into a creative field with encouragement from my folks. It’s not too rare for Millennials to have their bohemian dreams blessed by their parents, because, as progeny of the Boomers, we were mentored by aging rebels who idolized rogue poets, iconoclast cartoonists, and scrappy musicians.

The problem, warns Scott Timberg in his new book Culture Crash: The Killing of the Creative Class, is that if parents are basing their advice on how the economy used to support creativity – record deals for musicians, book contracts for writers, staff positions for journalists – then they might be surprised when their YouTube-famous daughter still needs help paying off her student loans. A mix of economic, cultural, and technological changes emanating from a neoliberal agenda, writes Timberg, “have undermined the way that culture has been produced for the past two centuries, crippling the economic prospects of not only artists but also the many people who supported and spread their work, and nothing yet has taken its place.”


Tech vs. the Creative Class

Timberg isn’t the first to notice. The supposed economic recovery that followed the recession of 2008 did nothing to repair the damage that had been done to the middle class. Only a wealthy few bounced back, and bounced higher than ever before, many of them the elites of Silicon Valley who found a way to harvest much of the wealth generated by new technologies. In Culture Crash, however, Timberg has framed the struggle of the working artist to make a living from his talents.

Besides the overall stagnation of the economy, Timberg shows how information technology has destabilized the creative class and deprofessionalized their labor, leading to an oligopoly of the mega corporations Apple, Google, and Facebook, where success is measured (and often paid) in webpage hits.

What Timberg glosses over is that if this new system is an oligopoly of tech companies, then what it replaced – or is still in the process of replacing – was a feudal system of newspapers, publishing houses, record labels, operas, and art galleries. The book is full of enough discouraging data and painful portraits of artists, though, to make this point moot. Things are definitely getting worse.

Why should these worldly worries make the Muse stutter when she is expected to sing from outside of history and without health insurance? Timberg proposes that if we are to save the “creative class” – the sector of society, often young and often from middle-class backgrounds, that generates cultural content – we need to shake this old myth. The Muse can inspire but not sustain. Members of the creative class, argues Timberg, depend not just on that original inspiration, but on an infrastructure that moves creations into the larger culture and somehow provides material support for those who make, distribute, and assess them. Today, that indispensable infrastructure is at risk…

Artists may never entirely disappear, but they are certainly vulnerable to the economic and cultural zeitgeist. Remember the Dark Ages? Timberg does, and drapes this shroud over every chapter. It comes off as alarmist at times. Culture is obviously no longer smothered by an authoritarian Catholic church.

 

Art as the Province of the Young and Independently Wealthy

But Timberg suggests that contemporary artists have signed away their rights in a new contract with the market. Cultural producers, no matter how important their output is to the rest of us, are expected to exhaust themselves without compensation because their work is, by definition, worthless until it’s profitable. Art is an act of passion – why not produce it for free, never mind that Apple, Google, and Facebook have the right to generate revenue from your production? “According to this way of thinking,” wrote Miya Tokumitsu describing the do-what-you-love mantra that rode out of Silicon Valley on the back of TED Talks, “labor is not something one does for compensation, but an act of self-love. If profit doesn’t happen to follow, it is because the worker’s passion and determination were insufficient.”

The fact is, when creativity becomes financially unsustainable, less is created, and that which does emerge is the product of trust-fund kids in their spare time. “If working in culture becomes something only for the wealthy, or those supported by corporate patronage, we lose the independent perspective that artistry is necessarily built on,” writes Timberg.

It would seem to be a position with many proponents except that artists have few loyal advocates on either side of the political spectrum. “A working artist is seen neither as the salt of the earth by the left, nor as a ‘job creator’ by the right – but as a kind of self-indulgent parasite by both sides,” writes Timberg.

That’s with respect to unsuccessful artists – in other words, the creative class’s 99 percent. But, as Timberg notes ruefully, “everyone loves a winner.” In their own way, both conservatives and liberals have stumbled into Voltaire’s Candide, accepting that all is for the best in the best of all possible worlds. If artists cannot make money, it’s because they are either untalented or esoteric elitists. It is the giants of pop music who are taking all the spoils, both financially and morally, in this new climate.

Timberg blames this winner-take-all attitude on the postmodernists who, beginning in the 1960s with film critic Pauline Kael, dismantled the idea that creative genius must be rescued from underneath the boots of mass appeal and replaced it with the concept of genius-as-mass-appeal. “Instead of coverage of, say, the lost recordings of pioneering bebop guitarist Charlie Christian,” writes Timberg, “we read pieces ‘in defense’ of blockbuster acts like the Eagles (the bestselling rock band in history), Billy Joel, Rush – groups whose songs…it was once impossible to get away from.”

Timberg doesn’t give enough weight to the fact that the same rebellion at the university liberated an enormous swath of art, literature, and music from the shadow of an exclusive (which is not to say unworthy) canon made up mostly of white men. In fact, many postmodernists have taken it upon themselves to look neither to the pop charts nor the Western canon for genius but, with the help of the Internet, to the broad creative class that Timberg wants to defend.

 

Creating in the Age of Poptimism

This doesn’t mean that today’s discovered geniuses can pay their bills, though, and Timberg is right to be shocked that, for the first time in history, pop culture is untouchable, off limits to critics and laypeople alike, whether on grounds of taste or principle. If you can’t stand pop music because of the hackneyed rhythms and indiscernible voices, you’ve failed to appreciate the wonders of crowdsourced culture – the same mystery that propels the market.

Sadly, Timberg puts himself in checkmate early on by repeatedly pitting black mega-stars like Kanye West against white indie-rockers like the Decemberists, whose ascent to the pop charts he characterizes as a rare triumph of mass taste.

But beyond his anti-hip-hop bias is an important argument: With ideological immunity, the pop charts are mimicking the stratification of our society. Under the guise of a popular carnival where a home-made YouTube video can bring a talented nobody the absurd fame of a celebrity, creative industries have nevertheless become more monotonous and inaccessible to new and disparate voices. In 1986, thirty-one chart-toppers came from twenty-nine different artists. Between 2008 and mid-2012, half of the number-one songs were property of only six stars. “Of course, it’s never been easy to land a hit record,” writes Timberg. “But recession-era rock has brought rewards to a smaller fraction of the artists than it did previously. Call it the music industry’s one percent.”

The same thing is happening with the written word. In the first decade of the new millennium, points out Timberg, citing Wired magazine, the market share of page views for the Internet’s top ten websites rose from 31 percent to 75 percent.

Timberg doesn’t mention that none of the six artists dominating the pop charts for those four years was a white man, but maybe that’s beside the point. In Borges’s “Babylon Lottery,” every citizen has the chance to be a sovereign. That doesn’t mean they are living in a democracy. Superstars are coming up from poverty, without the help of white male privilege, like never before, at the same time that poverty – for artists and for everyone else – is getting worse.

Essayists are often guilted into proposing solutions to the problems they perceive, but in many cases they would do better to leave well enough alone. Timberg wisely avoids laying out a ten-point plan to clean up the mess, but even his initial thrust toward justice – identifying the roots of the crisis – is a pastiche of sometimes contradictory liberal biases that looks to the past for temporary fixes.

Timberg puts the kibosh on corporate patronage of the arts, but pines for the days of newspapers run by wealthy families. When information technology is his target because it forces artists to distribute their work for free, removes the record store and bookstore clerks from the scene, and feeds consumer dollars to only a few Silicon Valley tsars, Timberg’s answer is to retrace our steps twenty years to the days of big record companies and Borders bookstores – since that model was slightly more compensatory to the creative class.

When his target is postmodern intellectuals who slander “middle-brow” culture as elitist, only to expend their breath in defense of super-rich pop stars, Timberg retreats fifty years to when intellectuals like Marshall McLuhan and Norman Mailer debated on network television and the word “philharmonic” excited the uncultured with awe rather than tickling them with anti-elitist mockery. Maybe television back then was more tolerable, but Timberg hardly even tries to sound uplifting. “At some point, someone will come up with a conception better than middlebrow,” he writes. “But until then, it beats the alternatives.”

 

The Fallacy of the Good Old Days

Timberg’s biggest mistake is that he tries to find a point in history when things were better for artists and then reroute us back there for fear of continued decline. What this translates to is a program of bipartisan moderation – a little bit more public funding here, a little more philanthropy there. Something everyone can agree on, but no one would ever get excited about.

Why not boldly state that a society is dysfunctional if there is enough food, shelter, and clothing to go around and yet an individual is forced to sacrifice these things in order to produce, out of humanistic virtue, the very thing which society has never demanded more of – culture? And if skeptics ask for a solution, why not suggest something big, a reorganization of society, from top to bottom, not just a vintage flotation device for the middle class? Rather than blame technological innovation for the poverty of artists, why not point the finger at those who own the technology and call for a system whereby efficiency doesn’t put people out of work, but allows them to work fewer hours for the same salary; whereby information is free not because an unpaid intern wrote content in a race for employment, but because we collectively pick up the tab?

This might not satisfy the TED Talk connoisseur’s taste for a clever and apolitical fix, but it definitely trumps championing a middle-ground littered with the casualties of cronyism, colonialism, racism, patriarchy, and all their siblings. And change must come soon because, if Timberg is right, “the price we ultimately pay” for allowing our creative class to remain on its crash course “is in the decline of art itself, diminishing understanding of ourselves, one another, and the eternal human spirit.”

 

http://www.alternet.org/news-amp-politics/killing-americas-creative-class?akid=12719.265072.45wrwl&rd=1&src=newsletter1030855&t=9

He’s not suddenly Paul Krugman: Let’s not morph Obama into Elizabeth Warren quite yet

Populist State of the Union with a fiery tone has liberals excited. They’d be wise to remember Obama’s true nature

He's not suddenly Paul Krugman: Let's not morph Obama into Elizabeth Warren quite yet
Paul Krugman, Barack Obama, Elizabeth Warren (Credit: AP/Reuters/Bob Strong/Junko Kimura-Matsumoto/Charles Dharapak/Photo montage by Salon)

Barack Obama’s State of the Union speech capped an epic political makeover. In two months he went from the living avatar of the political and economic establishment to a self-styled populist scourge. It’s as if he walked into a plastic surgeon’s office after Election Day and said “make me look like Bernie Sanders.” No president has ever tried to alter his image so drastically or so fast. I wonder if he’ll pull it off.

His campaign began emphatically on Nov. 5. Instead of the ritual submission the media demands of defeated party leaders, Obama used his post-election press conference to renew his vow to enact substantial immigration reform by executive order. Days later, he announced a major climate accord with China and finally came down foursquare for net neutrality.

These were big moves, but Obama was just warming up. In December, he announced the surprising end of our miserably failed Cuban trade embargo. Earlier this month, he unveiled a bold bid to make community college free for millions of students all across America.

Still not impressed? On Tuesday night he called for paid family leave, equal pay for equal work, a minimum wage hike and a tripling of the child tax credit to $3,000. He’s also pushing a $500 “second earner” tax credit and wants to give college students up to $2,500 apiece to help with expenses. The best part is how he’d pay for it all, mostly by taxing big banks, raising capital gains rates and closing loopholes that allow rich heirs to avoid capital gains taxes altogether.

A not-so-subtle shift in tone followed. Gone, for now, is Obama the ceaseless appeaser. He’s been replaced by a president with a more combative stance, as befits a true people’s champion. At times on Tuesday Obama even seemed to taunt his tormentors. In the last two months he has threatened five vetoes. In the previous six years he’d issued just two; that’s the fewest since James Garfield. Garfield, by the way, was president for six months.

What should we make of this new Obama? Are he and his new agenda for real? For liberals, these are tender questions. When Obama first appeared, their response was almost worshipful. Even today, many liberals treat Obama’s progressive critics as apostates. Given their deep investment in him, the vitriol of Tea Party attacks and the looming specter of GOP rule, it’s easy to understand why. But it’s crucial now for his liberal critics and defenders alike to see him as he is.



Obama’s new program seems real enough. We can’t gauge its full impact without more numbers, but this much is clear: Do it all — equal pay, minimum wage hike, community college tuition, family leave, middle-class tax credits and taxes on big banks and the superrich — and we’d make a very big dent in income inequality. Add the financial transaction tax Ralph Nader and Rose Ann DeMoro’s California nurses have long been pushing — and that some House Democrats now embrace — and you have enough money on the table to reverse decades of wage stagnation.

It may seem a big claim but the numbers are close to consensus. The transaction tax would raise a trillion dollars in 10 years, in which time a modest minimum wage hike would put $300 billion in the pockets of the working poor. Equal pay for equal work could do as much. Even without Obama’s numbers, we know the ideas gaining ground among Democrats could solve one of our biggest problems. As the president said apropos of just about everything, “this is good news, people.”

So what’s not to like? The bad news is there’s quite a bit. The problem is that Obama’s deeds so often contradict his words. Indeed, examine his actions over these same two months and one could also construct a compelling counter-narrative to this tale of populist transformation.

Consider climate change. While negotiating his China deal, Obama was also busy auctioning off drilling rights to 112 million acres of the Gulf of Mexico. As soon as the deal was done, he was on the phone urging Democrats to back a bill that cut EPA staff, let the Export-Import Bank fund coal-fired electric plants and blocked enforcement of new rules for energy-efficient light bulbs.

In his first term Obama passed the word to his top hires to quiet down about global warming. He likes fracking and brags about increasing oil production. He won’t let Congress approve the Keystone pipeline, but he may approve it himself. In short, he’s a study in mixed climate messages.

The net neutrality story is even more confounding. The statement Obama released was one of the more thoughtful of his presidency. But he’d already made Tom Wheeler, former CEO of the most powerful lobby opposing net neutrality, head of the Federal Communications Commission. And the FCC decides the issue. It’s an independent commission that does what it wants. Its members may be moved by Obama’s eloquent words, or just confused.

Perhaps the most troubling contradiction lies in foreign policy. Obama began his speech on Tuesday by saying “tonight we turn the page.” As evidence he cited our newly reduced role in Afghanistan. As he put it: “For the first time since 9/11, our combat mission in Afghanistan is over. Six years ago, nearly 180,000 American troops served in Iraq and Afghanistan. Today fewer than 15,000 remain. And we salute the courage and sacrifice of every man and woman… who has served to keep us safe.”

Obama’s relative restraint is such an improvement on George W. Bush’s bellicosity that we can’t help but judge him on a curve. That he’s bogged down in Afghanistan is no surprise, as these wars are always easier to start than finish. (It’s why they call them quagmires.) But in fact there are more than 15,000 Americans still left there. There are, for instance, the private contractors, whose number tripled under Obama. In early 2014, the last time figures were reported, there were 24,000. Obama says the “combat mission” is over — but the combat isn’t finished and neither is the mission.

On Wednesday, Mother Jones ran a story by Nick Turse of TomDispatch.com reporting that in 2014 Obama deployed U.S. Special Ops forces to 133 countries. That’s more than two-thirds of all the countries in the world; it’s a disturbing number and one that also grew exponentially on Obama’s watch. Even more disturbing are the drone strikes Obama has authorized, more than 10 times the number authorized by George W. Bush. American drones have now killed an estimated 4,000-plus people. At least 20 percent of them were innocent civilians; less than 2 percent were high-value military targets.

In case you thought our combat mission in Iraq ended, buried in Obama’s speech was a call for Congress to pass a “resolution to authorize the use of force against ISIL.” That was it — no explanation of vital interests at stake or limits to set. It was strange coming from a man who wouldn’t be president but for a speech he once gave against a war into which we were tragically conned.

Our war with ISIL proceeds under cover of our original Iraq war resolution, the exhaustion of which Obama concedes by implication. Someone should tell him the same resolution is used to justify drone strikes in nations we’re not at war with. Someone might also mention that use of “private security contractors” — the word “mercenary” stirs indignation — ill befits a democracy; that sending special ops forces to 133 countries also requires authorization and that if you declare an end to combat operations in two wars, your next budget should declare a peace dividend.

Obama’s failure to reconcile words to deeds detracts mightily from the grab bag of ideas he offers under the catchy title “middle class economics.” As noted, these policies could really improve people’s lives. But while he’s out thumping for them, he’s in hot pursuit of what he hopes will be his last coup, approval of the Trans-Pacific Partnership. It’s such a popular idea he chose not to breathe its name in his speech. What he did say was worth sampling if only to savor its cleverness: “China wants to write the rules for the world’s fastest-growing region. We should write those rules… That’s why I’m asking both parties to give me trade promotion authority to protect American workers, with strong new trade deals from Asia to Europe that aren’t just free, but fair.”

He doesn’t want another free trade fiasco like that awful NAFTA, just “trade promotion authority to protect American workers.” Surely we can all be for that.

Nearly all left-leaning Democrats oppose the TPP: Paul Krugman, Joe Stiglitz, Bob Reich, Elizabeth Warren. One can’t imagine Obama changing his mind on it any more than one imagines him asking any of them to help craft his new populist agenda. As he likes to reassure his donors, “I’m a market kind of guy,” meaning he comes as close as a Democrat can to being a market ideologue. And yes, there is such a thing.

Market ideologues aren’t the sort to throw bombs or ruin dinner parties but they’re ideologues nonetheless. Their solution for every problem known to mankind is to adopt “market principles.” Their influence on Obama’s generation of Democratic elites has been profound. It’s why so many of them apply market theory to issues to which it is ill-suited, such as carbon reduction, health care and public education.

Obama doesn’t get that free trade can be as good as he says for business and still be a terrible deal for workers. He doesn’t get that markets by their nature do a great job of creating wealth and a poor one of distributing it; that absent a strong government to encode and enforce a social contract there is no middle class; that pitting our workers against those lacking such support will eventually impoverish them. It’s why he opposed raising the minimum wage when he had the votes to do it in his first term. It’s why he bailed out banks but not homeowners, and abandoned the public option.

Missing from Obama’s speech, as from his presidency, was any mention of public corruption. Countless polls attest to the depth of public revulsion at the domination of government by moneyed interests. Obama’s silence allows the Tea Party to fly the flag of “crony capitalism.” Most progressives miss the critical importance of this issue, one that social change movements the world over put at the very top of their agendas.

That corruption makes it really hard to enact new government programs, which is one reason Obama didn’t propose any new federal programs, just tax cuts, private sector mandates and grants to states. There are things the federal government does better, but voters won’t hand over the keys to a car with a cracked engine block. A real populist would fix what we all know is broken.

Betting on what a politician truly thinks is a high-risk business. Some say Obama has changed. Perhaps so; maybe a friend gave him a Krugman book for Christmas and midway through it he had an epiphany. Others say he feels liberated; that’s a popular hope among liberals in that it implies he really did love them all along. Still others say he wants to shape his legacy or the next debate.

But in studying Obama, one discovers a man of markedly fixed views. His take on issues has barely budged over a lifetime. Once he sets a course he sticks to it. We saw it in 2008 when Hillary Clinton rose from the dead sporting a new populist persona. It surprised many to see her peddling her wares to the working class. It shocked them when she won the Pennsylvania primary. John McCain shocked some by running even with Obama up until the Wall Street crash. We don’t know if either shocked Obama, but we do know he never once changed course.

On Tuesday he devoted an astonishing 20 percent of his time not to global warming or “middle-class economics” but to a defense of his 10-year pursuit of the holy grail of bipartisanship. For six years Obama played Charlie Brown to the Republicans’ Lucy in budget battles. In December he took another crack at the football. Is his new populism such a far cry from his 2008 rhetoric of transformation, or just a bit more specific to satisfy the hunger still rising for change? Do we really think it arose from somewhere other than the usual focus groups and polls?

There’s good news in all this. Someone changed, and if it isn’t Obama it must be us. It isn’t any politician but the power of public opinion that drives this debate. Republicans feel it. Hearing just an outline of a populist message scares them. Pundits say they won’t pass any part of Obama’s agenda but if they’re smart they will; perhaps a lesser minimum wage hike and something just for women. But we’ll never win the victory we must win without a strong progressive movement because neither this system nor those who run it will ever really change.

Bill Curry was White House counselor to President Clinton and a two-time Democratic nominee for governor of Connecticut. He is at work on a book on President Obama and the politics of populism.

 

http://www.salon.com/2015/01/25/hes_not_suddenly_paul_krugman_lets_not_morph_obama_into_elizabeth_warren_quite_yet/?source=newsletter

Jim Rockford Warned Us About Google And Facebook Back In 1978

Why didn’t we listen? The fourth season of The Rockford Files, arguably the greatest television show of all time, features a “futuristic” storyline about a terrible threat. What if a private corporation used computers to gather personal information on hundreds of millions of Americans? Could we trust them with that data?

I know, it’s hard to imagine such a thing ever happening — a private company, collecting private and personal data on ordinary Americans and other people around the world. It sounds far-fetched, right? But Jim Rockford, the toughest and most incorruptible P.I. ever to live in a trailer with his dad, teams up with a younger detective to investigate the suspicious death of an old friend, a private detective named Tooley, in the episode “The House on Willis Avenue.” (This episode is written by the show’s co-creator, Stephen J. Cannell, who also gave us The Greatest American Hero.)

And what Rockford finds in his investigation is baffling — a mysterious set of real estate developments, with lots of suspiciously huge air-conditioning units attached. What’s going on? Turns out that a corporate scumbag, amusingly played by Jackie Cooper, is creating a secret computer system to spy on ordinary Americans and sell the info — or ruin your reputation — for profit. It should be illegal for corporations to spy on ordinary Americans, Rockford protests. You can see the highlights above.

It all leads up to this solemn cue card at the very end of the episode:

[Image: the episode’s closing cue card]

The Rockford Files really wants you to know that corporations should not use computers to collect your personal information. Those final words, “Our liberty may well be the price we pay,” seem especially prophetic nowadays.

The other amazing part of the episode is all the parts where Rockford and his temporary sidekick pretend to be computer experts, and spout ridiculously made-up computer jargon, to try to fool people in the facilities they’re sneaking into. Here are the two best examples of that:

http://www.viddler.com/embed/21c50871/?f=1&autoplay=false&player=mini&disablebranding=0;offset=0&autoplay=0

http://io9.com/jim-rockford-warned-us-about-google-and-facebook-back-i-1681231028
