1987 is the most important year in alternative rock

From chart hits to mainstream breakthroughs, it was the year modern rock came into its own

The Psychedelic Furs; Depeche Mode; Echo & the Bunnymen (Credit: Legacy/Sire)

The National has achieved many things in its career: music festival headlining slots, Grammy nominations and near-chart-topping albums. However, the brooding, Brooklyn-via-Cincinnati band hasn’t had a No. 1 single — until now.

Billboard reports “The System Only Dreams in Total Darkness,” a song from the National’s forthcoming “Sleep Well Beast” album, reached the top slot of the Adult Alternative Songs chart for the week of August 19, beating Arcade Fire’s “Everything Now” by a measly two spins.

In addition to the National and Arcade Fire, the rest of the Adult Alternative Songs chart top 10 includes a slew of seasoned artists: Portugal. The Man, Spoon, The War on Drugs, The Killers and Jack Johnson. Sonically, these acts don’t overlap much. However, on an aesthetic level, they all promulgate a musical approach predicated on constant metamorphosis.

The Killers’ “The Man” is an icy, funky strut; the War on Drugs’ “Holding On” is rife with sparkling Springsteenisms; Portugal. The Man’s “Feel It Still” is a taut, soulful shimmy. The biggest chameleons might be Spoon, whose latest effort is the ominous, funky “Can I Sit Next to You,” which feels like Duran Duran filtered through a paper shredder and pieced back together.

In a striking parallel, the makeup of the Adult Alternative Songs chart — notably its abundance of veteran bands fearless about evolution — echoes 1987’s music landscape, which was dotted with equally transformative alternative acts.

That’s not necessarily surprising: 1987 was an enormously influential year that shaped how fans and artists alike create, consume and appreciate so-called modern or progressive music.

To understand why 1987 is a cultural inflection point, it’s best to consider it the year a burgeoning underground movement crystallized and mobilized. Certain facets of this movement were already in place, of course. Specialty national video shows such as MTV’s “120 Minutes” and “I.R.S. Records Presents The Cutting Edge” and USA’s “Night Flight,” as well as regional video shows (V66 in Boston and MV3 in Los Angeles) were already airing clips from new wave and so-called “college rock” bands. Modern rock-leaning radio stations — notably KROQ in Los Angeles and the Long Island powerhouse WLIR — were also giving these new groups a platform.

On a more mainstream level, John Hughes-associated movies such as 1984’s “Sixteen Candles” and 1986’s “Pretty in Pink” combined relatable depictions of teen angst with a cool-mixtape musical vibe. Hughes treated bands such as Thompson Twins, New Order, OMD and the Psychedelic Furs like futuristic pace-setters. The people responded in kind.

In 1986, OMD’s “If You Leave,” from Hughes’ “Pretty in Pink,” peaked at No. 4 on the U.S. pop charts. No wonder critic Chris Molanphy, writing in Maura Magazine, points to “how pivotal Hughes was in helping to break what became known as alternative rock in America — he served as a bridge between what was known in the first half of the ’80s as postpunk or new wave and what would be called alt-rock or indie rock by the ’90s.”

Hughes’ imprint reverberated well beyond films. For example, the movie “Pretty in Pink” took its title from the Psychedelic Furs song of the same name. Appropriately, the U.K. band re-cut the tune, which originally appeared on 1981’s “Talk Talk Talk,” for the film’s 1986 soundtrack. This slicker new version landed just outside the top 40, at No. 41. However, the goodwill earned by this re-do buoyed the Furs through 1987: The desperate swoon “Heartbreak Beat,” the lead single from the band’s 1987 LP, “Midnight to Midnight,” became the Furs’ only U.S. top 40 hit, peaking at No. 26 in May.

“Midnight to Midnight” polarized fans: A collection of full-on synth-pop gloss, it bears little resemblance to the group’s early, moody post-punk. Yet bold evolutions were a 1987 trend; multiple established modern and indie bands staked a decidedly contemporary claim, sometimes in ways that completely overhauled (or at least added intriguing new dimensions to) their previous sounds.

(Let the record show that this phenomenon also has precedent: For example, Scritti Politti’s glittering synth-pop gush “Perfect Way,” which reached the top 15 in 1986, is a far cry from the band’s scabrous post-punk roots.)

Elsewhere in 1987, Echo & The Bunnymen buffed up their gloom on a self-titled album with sharper production, while Depeche Mode countered with “Music for the Masses,” an (appropriately) massive-sounding record with a dense, industrial-synth sound. The Replacements, meanwhile, teamed up with producer Jim Dickinson for “Pleased to Meet Me,” their most streamlined and focused rock record yet. R.E.M. forged a production partnership with Scott Litt that would stretch into the ’90s, releasing the loud-and-proud political statement “Document.”

In many cases, these evolutions didn’t necessarily lead to immediate commercial dividends. In fact, the Smiths — inarguably one of the biggest cult alternative acts in the U.S. — broke up in 1987, making their forward-sounding final album, “Strangeways, Here We Come,” a posthumous swan song. However, in 1987, the upper reaches of the pop charts were noticeably more amenable to modern bands.

Consider this the culmination of a slow and steady trend: the chart inroads made by OMD and the Psychedelic Furs paired with those made by Pet Shop Boys (“West End Girls” hit No. 1 in 1986) and INXS (which set the stage for its blockbuster 1987 record “Kick” with 1985’s top 5 smash “What You Need”).

“Just Like Heaven,” from 1987’s “Kiss Me, Kiss Me, Kiss Me,” became The Cure’s first U.S. mainstream top 40 chart hit, peaking at No. 40 in 1988. The shimmering 1987 synth-pop gem “True Faith” also became New Order’s first top 40 single, landing at No. 32. Other bands found even greater success: Midnight Oil’s “Diesel and Dust” spawned that band’s first mainstream hit, the top 20 entry “Beds Are Burning,” while R.E.M. landed its first top 10 single with “The One I Love” from “Document.” Los Lobos’ cover of “La Bamba” hit No. 1 (though having a major Hollywood movie behind them helped tremendously).

What’s interesting: Besides individual radio station charts and specialized trade magazines, these alternative acts didn’t yet have a dedicated Billboard chart. The publication only launched its Modern Rock Tracks chart on Sept. 10, 1988, “in response to industry demand for consistent information on alternative airplay,” as it noted in that week’s issue. In hindsight, it’s easy to see this chart as a reaction to 1987’s alternative groundswell. The influence of these groups was now impossible to ignore, and measuring their reach and impact — no doubt crucial for label bean counters, if nothing else — made sense.

In an interesting twist, 1987’s beginnings and endings were as formative as its transformations. That year’s dissolution of the Smiths and Hüsker Dü led to each band’s frontman — Morrissey and Bob Mould, respectively — launching fruitful and vibrant solo careers that endure today. Debut records from Jane’s Addiction (a self-titled effort) and Pixies (“Come On Pilgrim”), and second albums from Dinosaur Jr (“You’re Living All Over Me”) and Faith No More (“Introduce Yourself”) put forth an aggressive, hybridized rock sound that presaged ’90s grunge, metal and punk. Even Crowded House’s 1986 debut LP finally spawned two hits in 1987, “Don’t Dream It’s Over” and “Something So Strong.”

And, in terms of the touring circuit, plenty of popular 1987 acts continue to find success; U2 playing 1987’s “The Joshua Tree” to packed stadiums is the most obvious example. Depeche Mode is currently embarking on an amphitheater tour, while Echo & the Bunnymen and Violent Femmes have toured sheds together all summer. In spring 2017, the Psychedelic Furs tapped Robyn Hitchcock as an opener. In the fall, the band is teaming up with Bash & Pop, featuring the Replacements’ Tommy Stinson, for tour dates.

These tour dates in particular have recently led several writers to question whether “’80s pop” or “classic alternative” could become the new classic rock. It’s an intriguing idea, although one the radio consultants at Jacobs Media doubt has traction.

“While these bands may do well at state fairs and other summer festivals boasting well-stocked lineups of bands, their ability to support a format is questionable,” Fred Jacobs wrote in a recent blog post. “Classic Rock — and its derivatives — as well as Oldies stations were predicated on the power of nostalgia — not just for a few thousand fans in a market, but for tens of thousands or more of die-hard supporters. We’re talking mass appeal vs. niche.”

Jacobs then went on to point out that Echo & The Bunnymen received only seven spins on a Classic Alternative station in a recent week. “It’s hard to create a groundswell of support for poorly exposed music that’s now 30+ years old,” Jacobs adds.

In a sense, current successful bands like Arcade Fire, Spoon and the National are better positioned than their 1987 analogs to avoid this trap. Multiple channels — radio, video, streaming, live shows — make it easier for bands to gain exposure and reach more people.

At the same time, 2017’s fractured musical culture means that there are plenty of people who either don’t listen (or don’t need to listen) to any of these bands. For proof, just look at the puzzled reactions to Arcade Fire nabbing the Album of the Year Grammy in 2011. One person’s mainstream band is another’s niche or unknown act. Perhaps the underlying concept that drove alternative music culture’s 1987 rise — the mainstream cracking the door open to outsiders — is still alive and well in 2017.

Annie Zaleski is a Cleveland-based journalist who writes regularly for The A.V. Club, and has also been published by Rolling Stone, Vulture, RBMA, Thrillist and Spin.

Trump is the ultimate fulfillment of consumer capitalism

Trump embodies the triumph of spectacle over reality more than any previous president, but he’s no anomaly

As Donald Trump’s inability to govern has become increasingly evident over the past six months, the White House has essentially transitioned into a full-blown reality TV show, with enough melodrama and petty infighting to fill several seasons worth of primetime network television.

The president, it seems, has given up all pretense of sanity as his administration has spiraled out of control. He now appears to approach his current job of running the United States government in the same way that he approached his career as a reality TV star. Top officials in the Trump administration have become virtual contestants, vying for the affection of their capricious boss and hoping he won’t mention their names in his next unhinged Twitter rant.

This transition into a dysfunctional reality TV show came to a head two weeks ago, when the president hired Anthony “The Mooch” Scaramucci, the cartoonish and vainglorious Wall Street investor, as his communications director. Like a fame-hungry contestant on “The Apprentice,” the foul-mouthed financier wasted no time in marking his territory and attacking his fellow sycophants, calling then-White House chief of staff Reince Priebus a “paranoid schizophrenic,” while threatening to fire his entire staff.

By the end of his first week, Priebus had been forced out, Scaramucci’s wife had filed for divorce, and then, on Monday, “The Mooch” himself was eliminated from the Trumpian Thunderdome also known as the White House.

As all this drama unfolded, Trump’s agenda took yet another blow with the implosion of the Republican health care bill in the Senate, leaving the president with no major policy achievement to speak of in his first six months in office. Though Trump has repeatedly claimed to have accomplished more than any of his predecessors in his first months in office, the truth is that he has overseen the most incompetent and amateurish administration in modern history. As Ryan Cooper recently put it in The Week, “the hapless incompetence of this administration is virtually impossible to exaggerate.”

The president’s first six months have confirmed what many people already knew: Trump’s image as a savvy and smart businessman with an extraordinary deal-making ability is a complete sham: the president didn’t know the first thing about running a government when he ran for office. The New York billionaire (if he is indeed a billionaire) has spent his entire adult life carefully cultivating his image as a masterful deal-maker and builder, plastering his name onto anything and everything (including many properties that he does not own) and greatly exaggerating his net worth. Trump has always been more spectacle than substance, and like a used car salesman rolling back the odometers, he made countless promises during his campaign (he would repeal and replace Obamacare “on day one,” for instance) without any real plan on how to fulfill these promises. Just like his career, Trump’s campaign was all spectacle, no substance.

Not surprisingly, then, as Trump’s true nature has become more apparent and his incompetence on full display, the spectacle surrounding his White House has only grown more outrageous. Like a Ponzi-scheme operator whose promised returns become more ridiculously bullish as investors flee and the coffers drain, the president’s rhetoric has become more grandiose and detached from reality as his presidency has gone off the rails. One can expect the circus to grow more preposterous still as the Trump administration continues to implode.

For many Americans, the spectacle will always be enough; whether or not Trump is ever successful in terms of policy, the image he projects on television screens will continue to convince millions. It is comforting to think of our reality TV president and his political rise as some kind of anomaly, but that’s not true. Donald Trump is a product of late capitalism, and the spectacle will continue to dominate in a world where all aspects of life have been commodified and each person has become just another customer.

In his classic work “The Society of the Spectacle,” published 50 years ago, French theorist Guy Debord expounded on what he called the “spectacular society,” in which the modern capitalist mode of production “presents itself as an immense accumulation of spectacles.” The society of the spectacle, postulated the founder of the political-artistic collective known as the Situationist International, had developed over the 20th century with the rise of mass media and the commodity’s “colonization of social life.”

“Understood on its own terms,” wrote Debord in his aphoristic style, “the spectacle proclaims the predominance of appearances and asserts that all human life, which is to say all social life, is mere appearance . . . In all its specific manifestations — news or propaganda, advertising or the actual consumption of entertainment — the spectacle epitomizes the prevailing model of social life.”

Half a century after Debord published his influential treatise, the society of the spectacle has given rise to a president who epitomizes the prevailing model of social life, where appearances often predominate over reality.

“In a world that really has been turned on its head,” observed Debord, “truth is a moment of falsehood.” One could be forgiven for assuming that he was describing our world today.

Conor Lynch is a writer and journalist living in New York City. His work has appeared on Salon, AlterNet, Counterpunch and openDemocracy. Follow him on Twitter: @dilgentbureauct.

The American nightmare: the career of moviemaker George Romero

Nicole Colson looks at the career of moviemaker George Romero, who died July 16.

“The tradition of all dead generations weighs like a nightmare on the brains of the living.”
— Karl Marx

“They’re coming to get you, Barbara.”
— Johnny, Night of the Living Dead

Zombies on the hunt in Night of the Living Dead

THE “THEY” that were coming for poor Barbara–and for us in the audience–changed frequently from film to film over the course of horror-master George Romero’s career. Racism, consumer culture, Reagan-era militarism, the ruling class–all were fair game for the father of the modern zombie movie, who died on July 16 after a career in film spanning more than 40 years.

Romero’s best and most enduring film was his first–1968’s Night of the Living Dead. It was made by a group of friends and shot on a shoestring budget outside of Pittsburgh in just 30 days, with the cast and crew working in 24-hour shifts. Friends played the zombies, and the meat and entrails needed for the gore were helpfully provided by a friend who was a butcher.

Night of the Living Dead tells the story of a group of survivors of a zombie outbreak seeking shelter in a farmhouse–catatonic Barbara, whose brother Johnny is killed in the opening scene; squabbling parents Harry and Helen and their young daughter Karen; young couple Tom and Judy; and protagonist Ben. [Editor’s note: Spoilers abound throughout this article, but it’s your fault if you haven’t seen these movies already.]

The film masterfully ratchets up the tension as the survivors fall victim to the zombies, before ending on a note that is both shocking and beautifully bleak, and stands as not only a masterpiece of modern horror, but one of the starkest commentaries on racism in American film.

While zombies had been depicted in American film before Night of the Living Dead, they were, essentially, not monsters but slave labor. Romero’s genius was in taking zombies and turning them into a slow, inexorable, mindless force bent only on feeding–a force that he would use as an allegorical foil in multiple films.

– – – – – – – – – – – – – – – –

NIGHT OF the Living Dead was one of a number of horror movies from the late 1960s and ’70s that were shaped by the massive upheavals taking place in American society.

Romero and other horror icons who emerged at the time, like Wes Craven and Tobe Hooper, transmitted on screen what they saw in the world: the violence of police, dogs and fire hoses being used against civil rights marchers; city streets burning in the wake of urban rebellions; body bags and the escalating horrors of Vietnam.

The monstrous was already taking place in real America every day, and making its way into living rooms on the evening news. It was only natural that this should find its way onto the big screen as well.

As Romero commented in the 2000 documentary The American Nightmare, explaining the choice to shoot his first movie in black and white, “In those days, the news was in black and white, and black and white was the medium…I thought it was great, you know, this idea of a revolution…a new society devouring the old completely and just changing everything.”

“Obviously,” Romero added, “what’s happening in the world creeps into any work.”

In contrast to the multitude of zombies in horror today–from the Walking Dead to World War Z, the endless iterations of Romero’s own Living Dead series, and even a genre of “zombie romantic comedies”–Night of the Living Dead marked the first time that an act of cannibalism had been portrayed on a U.S. movie screen.

The scene of a young child eating her parents was shocking for what it said about a society so at odds with itself, as evidenced by the racist violence used against civil rights activists and the increasing numbers of young men returning home from Vietnam in body bags or still alive, but profoundly changed by the horrors of war.

In fact, the look of Romero’s 1978 Dawn of the Dead owes much to effects artist Tom Savini, who used his experiences as a combat photographer in Vietnam as inspiration.

While Romero didn’t originally intend for Night of the Living Dead to be a commentary on race and racism, the casting of Black actor Duane Jones as protagonist Ben makes it impossible to not see the film this way. Because of Jones, scenes in the film evoke lynchings, the terror of the Klan and the civil rights movement.

That Ben is authoritative and seemingly in control throughout–slapping Barbara out of hysterics at one point and telling Harry to “Get the hell down in the cellar. You can be the boss down there. I’m boss up here”–only underscores the gut-punch of the casual brutality of his death. It is one of the most emotionally devastating endings in modern film, not just horror.

In a twist of fate, Romero and his collaborators finished the film and loaded it in a car trunk to take to New York to find a distributor on April 4, 1968–the day Martin Luther King Jr. was assassinated.

– – – – – – – – – – – – – – – –

PART OF the brilliance of Night of the Living Dead is the way it builds tension through demolishing movie conventions–the ostensible heroine becomes catatonic for virtually the entire film, only to be eaten by her brother; the young couple in love are burned alive and eaten; the nuclear family is (literally) torn apart as a zombified child chomps on her dad and then hacks mom to death with a trowel.

Just when dawn is breaking and rescue seems imminent for Ben–who has, against all odds, survived the long night of the zombie hordes–he is nonchalantly gunned down. Shot between the eyes by a white sheriff’s posse, he is reduced to “another one for the fire,” his body dragged away with hooks.

As Renée Graham wrote recently in the Boston Globe about seeing the film in 1968:

Everyone in the posse is white. Ben is African American. I was a child, but the message I received was depressingly clear: They killed Ben because they believed a Black man had to be a threat. A Black hero equaled a dead hero…

Already, 1968 had been a beast of a year. On an April night, as my family prepared to celebrate my father’s birthday, a TV bulletin announced the assassination of Martin Luther King Jr. Two months later, still wearing a costume from my dance recital that evening, I stood in front of the television watching mourners wave, weep and salute as a train carried the body of Robert F. Kennedy, murdered days earlier. For months, every adult around me walked around in agony and silence.

Yet nothing that year affected me as profoundly as watching Ben die…

Night of the Living Dead made Romero a legend by expanding the audience’s concept of what a horror movie could be. Its resonance for me still cuts deeper. Whatever his original intentions, Romero’s classic taught me early and indelibly that the real monsters who threaten us aren’t undead ghouls stalking the night.

“We always had that ending,” Romero commented later. “It seemed like the only fitting end. And even though the posse goes rolling across the countryside, leaving our hero dead, we get the feeling that they are not going to win either. There’s this new society coming. In the end, none of this is going to work, guys.”

– – – – – – – – – – – – – – – –

LATER INSTALLMENTS of Romero’s Living Dead series would make the social commentary more explicit–though none to the same brilliant extent as Night.

Romero would state in one DVD commentary that while writing or making a movie, the points he would make about society were often more important to him than the characters–which led to politically interesting films, though the storytelling sometimes suffered.

Dawn of the Dead, perhaps Romero’s best film outside of Night, is a satire on consumer culture. In it, survivors take refuge from zombies inside Pittsburgh’s Monroeville Mall, leading to hilarious sequences of “mindless consumption”–both by human survivors and, of course, the zombies. “This was an important place in their lives,” one character says in explaining why the zombies would be drawn to a shopping mall even after death.

In 1985’s Day of the Dead, Reagan-era militarism comes under fire. A group of scientists and soldiers are stuck in an underground military base, coming into conflict over the macho militarism of the soldiers in charge, as well as the grisly experiments by one of the scientists, who believes that zombies can be made docile.

Of particular interest in today’s political climate, however, is Romero’s 2005 Land of the Dead–an uneven, though often hilarious, entry in the Living Dead series.

A maniacal and entertainingly campy Dennis Hopper plays a character clearly modeled on our current president. In the face of societal collapse, he has convinced the super-rich to move into a self-enclosed, Trump Tower-like high-rise called “Fiddler’s Green.” The rest of the human population lives outside in squalor and fear, while Hopper’s character sponsors mercenaries to go out on runs and collect supplies for the rich.

At the core of the film is the question of the allegiances of the mercenaries and the potential of a growing zombie consciousness. Led by a Black zombie mechanic, the zombies begin to communicate and organize. The climax of the film hinges on a moment of zombie-human solidarity that is both funny and deeply satisfying for socialists to watch.

As Romero once said, “The zombie for me was always the blue-collar monster. He is us.”

Also well worth watching is Martin, Romero’s 1978 meditation on the vampire genre–which asks what makes a monster and what kind of psychological dysfunction lurks inside the house next door.

Though uneven, the film takes joy in exploding romanticized notions about vampires–there’s nary a sexy or sparkly vampire anywhere in it–to get down to brass tacks in examining what might be thought of as vampire working conditions.

– – – – – – – – – – – – – – – –

“I’VE BEEN able to use genre of fantasy/horror and express my opinion, talk a little about society, do a little bit of satire,” Romero once said, “and that’s been great, man.”

Marxist film scholar Robin Wood, a staunch defender of Romero’s work into the 1980s, would go farther, claiming:

It is perhaps the lingering intellectual distrust of the horror genre that has prevented George Romero’s Living Dead [series] from receiving full recognition for what it undoubtedly is: one of the most remarkable and audacious achievements of modern American cinema, and the most uncompromising critique of contemporary America (and, by extension, Western capitalist society in general) that is possible within the terms and conditions of a “popular entertainment” medium.

So in tribute to George Romero, don’t wait until Halloween this year to watch his movies. With the monsters in Washington revealing their true natures every day, we should remember what Romero thought about the potential of horror:

Horror is radical. It can take you into a completely new world, new place, and just rattle your cage and say, wait a minute–look at things differently. That shock of horror is what horror’s all about. But in most cases, at the end of the story, people try to bring everything back–the girl gets the guy and everything’s fine and things go on just the way they were. Which is really why we are doing this in the first place. We don’t want things the way they are or we wouldn’t be trying to shock you into an alternative place.

So go ahead–when it comes to George Romero movies, consume away.

https://socialistworker.org/2017/08/02/the-american-nightmare

What if all journalists wrote like tech journalists?

Hypnotized by Silicon Valley’s hype machine, too many tech writers are little more than fawning lackeys

Apple CEO Tim Cook (Credit: AP/Richard Drew)

Ever since the ’90s tech boom, tech journalists have tended to cover their beat a little differently. More specifically, compared to other journalistic fields, tech journalism is more likely to be reverent, even fawning, toward the subjects it is supposed to critique.

“Visit any technology-focused media outlet, or the tech sections of many news organizations, and you’ll see that ‘gadget porn’ videos, hagiographic profiles of startup founders or the regurgitation of lofty growth expectations from Wall Street analysts vastly outnumber critical analyses of technological disruption,” wrote my Salon colleague Angelo Young, in his article about how Silicon Valley sells a side of ideology with its gadgets. “[C]riticisms that do exist tend to focus on ancillary issues, such as Silicon Valley’s dismal lack of workplace diversity, or how innovation is upsetting norms in the labor market, or the class-based digital divide; all are no doubt important topics, but they’re ones that don’t question the overall assumptions that innovation and disruption are at worst harmless if not benevolent.”

This is the dismal state of tech journalism in the digital era: Because of the tech industry’s success at branding itself as selling a sunny, progressive vision of the future, most (but not all) tech journalists don’t really cover it with a critical eye. Some publications and blogs read like advertisements for gadgets, and breathlessly cover minor, quotidian firmware updates as if they were front-page stories.

Which leads us to a thought experiment: What if all news stories were written the way tech journalists cover their beats? What if all journalism were infected with the same cultish CEO adulation, the fawning adoration, the puff pieces about how this week’s disposable gadget is the most amazing thing to have ever existed? How might the news look to us then?

Reader, here’s a take on a shiny new product — not in the tech industry, but the fast-food industry — written from just that perspective.

It’s here: McDonald’s Lobster Roll 7

McDonald’s adds a fresher crustacean and an improved crunch at a higher price point — but should you buy it?

Reviewing the new Lobster Roll 7 is a lot like reviewing the previous models, but different. The roll, which was announced last week at the annual Worldwide Food-Eater’s Conference (WFEC) at the futuristic McDonald’s campus in Oak Brook, outwardly resembles the last Lobster Roll — but inside, it’s totally new.

Until recently, the notoriously secretive fast-food company had been mum about the next generation of its ever-popular sandwich. For the past two years, journalists have had to rely on the slow trickle of supply-chain rumors and leaked photos of questionable veracity to guess at the nature of the next model. The hype machine is so overblown that there’s a cottage industry of excitable vloggers who make a living creating speculative computer mock-ups of the next Lobster Roll and posting them on YouTube.

But back to the WFEC reveal: Perhaps the most surprising announcement from the Wizards of Oak Brook was the changed naming scheme. No more McLobster. It’s just Lobster Roll 7 now.

More elegant, perhaps, and an intriguing shift in marketing strategy for the decidedly minimalist company. Indeed, since meticulously engineering the first McLobster, the company has opted to make subtle year-to-year changes in its sandwich design, a strategy that makes sense given its astonishing popularity. Many reviewers believed the second-generation model was “near-perfect”; how can you improve on perfection? Tinkering is all that’s left. Yet as sandwich tech improves, we can surely expect to see some of the fruits of that innovation incorporated into this model. McDonald’s certainly spends a pretty penny on R&D.

If you take a closer look, most of the outward changes to this Lobster Roll model are subtle. For example, the way the lobster is cut is more angular than before — an interesting design decision for a company that has always had a fetish for bevels. And for the first time, the lettuce no longer comes from Korea-based Lettuce International; relations between the two companies have been frosty ever since LI began selling a competing crustacean roll.

While anticipation was sky-high for the next iteration of the Lobster Roll line, questions linger about McDonald’s ability to continue to dominate the industry. Mickey D fanboys have reason to be suspicious of this release; ever since the first groundbreaking McLobster was released in 1993, the crustacean fast-food space has become way more crowded. Rival Panera has been innovating in the crustacean sandwich world for the past few years, and has built up an impressive condiment ecosystem to rival that of McDonald’s. Likewise, the new Lobster Roll 7 will be the first iteration of the McDonald’s line released under the aegis of new CEO Steve Easterbrook, who last year took the reins from his iconic, brilliant, industry-defining predecessor, the ever-reclusive wunderkind Don Thompson — whose leadership of the company has been immortalized in five biopics over the past 10 years, including an eponymous 2016 film written by Aaron Sorkin.

But enough about expectations. What about the product?

Let’s just say that fanboys are likely to call this Lobster Roll “world changing” (many of them without even tasting it, if the long lines outside McDonald’s stores across the globe are any indication). You may roll your eyes at such an epithet, but they may indeed be right. Even the most crotchety reviewers will likely admit that the Lobster Roll 7 epitomizes a paradigm shift for the industry. And to those who say it’s barely an innovation on previous models, I say this: Why mess with perfection?

Keith A. Spencer is a cover editor at Salon.

CNN’s “The Nineties”: Empty nostalgia for a decade we should let die

CNN delves into a decade of pat neoliberalism and hollow spectacle and, unsurprisingly, comes up with nothing

The Nineties (Credit: CNN)

To anyone who came of age in the 1990s, the current cultural ascent of fidget spinners is likely to induce an acute pang of recognition — equal parts wistful nostalgia, anxiety and woozy terror. The ’90s were, as any certified “Nineties Kid” can attest, a decade marked by a succession of asinine schoolyard fads.

One can imagine an alternative timeline of the decade that marks time not by year, but the chronology of crazes: the Year of the Beanie Baby, the Year of the Tamagotchi, the Years of the Snap-Bracelet, the Macarena, the Baggy Starter Jacket, the Painstakingly Layered “The Rachel” Hairdo, and so on. What’s most remarkable about our culture’s whirring fidget spinner fetish is that it didn’t happen sooner; that this peak fad didn’t emerge from among the long, rolling sierra of hollow amusements that defined the 1990s.

Surveying the current pop-culture landscape, one gets the sense that the ’90s — with all its flash-in-the-pan fads and cooked-up crazes — never ended. On TV, “The Simpsons” endures into its 28th season, while David Lynch and Mark Frost’s oddball ABC drama “Twin Peaks” enjoys a highly successful, and artistically fruitful, premium-cable revival. The Power Rangers, Ninja Turtles, Transformers and Treasure Trolls have graduated from small-screen Saturday morning silliness to blockbuster entertainments.

Elsewhere, the “normcore”/“dadcore”/“lazycore” fashion of singers like Mac DeMarco has made it OK (even haute) to dress up like a “Home Improvement”-era Jonathan Taylor Thomas. And Nintendo recently announced its latest money-printing scheme, in the form of the forthcoming SNES Classic Mini: a plug-and-play throwback video game console chock-full of nostalgia-baiting Console Wars standbys like “Donkey Kong Country,” “F-Zero” and “Star Fox.” Content mills like BuzzFeed, Upworthy and their ilk bolster their bottom line churning out lists and quizzes reminding you that, yes, the show “Rugrats” existed.

To quote a nostalgic ’97-vintage hit single, which was itself a throwback to ’60s jazz-pop, it’s all just a little bit of history repeating.

It’s natural to long for the past: to trip down memory lane, get all dewy-eyed about what came before, pine for the purity of the long-trampled gardens of innocence, and go full Proust on the bric-a-brac of youth that manages to impress itself on the soft, still-maturing amber of the adolescent mind, even if that stuff was total crap like Moon Shoes or a Street Shark or Totally Hair Barbie doll or a bucket of Nickelodeon-brand goo called “Gak.” The 1990s, however, offered a particularly potent nostalgia trap, something revealed watching CNN’s new TV documentary miniseries “event,” fittingly called “The Nineties.”

A follow-up to CNN’s previous history-of-a-decade events (“The Sixties,” “The Seventies” and “The Eighties”) and co-produced by Tom Hanks, the series provides some valuable insight into the nature of ’90s nostalgia. The two-part series opener, called “The One About TV,” threads the needle, examining the ways in which television of the era shifted the standards of cultural acceptability, be it in Andy Sipowicz’s expletive-laden racism, Homer Simpson’s casual stranglings of his misfit son or the highbrow, Noel Coward-in-primetime farces of “Frasier.”

To believe CNN’s procession of talking heads, damn near every TV show to debut after midnight on Jan. 1, 1990, was “revolutionary.” “The Simpsons” was revolutionary for the way it hated TV. “Twin Peaks” was revolutionary for the way it subverted it. “Seinfeld” ignored (or subtracted, into its famous “Show About Nothing” ethic) the conventions of the sitcom. “Frasier” elevated them. “Will & Grace,” “Ellen” and “The Real World” bravely depicted gay America. Ditto “Arsenio,” “Fresh Prince” and “In Living Color” in representing black America. “OZ” was revolutionary for its violence. “The Sopranos” was revolutionary in how it got you to root for the bad guy. “Friends” was revolutionary because it showed the day-to-day lives of, well, some friends. If the line of argumentation developed by “The Nineties” is to be believed, the TV game was being changed so frequently that it was becoming impossible to keep up with the rules.

Despite being argumentatively fallacious (if everything is subversive or game-changing, then, one might argue, nothing is), and further debasing the concept of revolution itself, such an argument cuts to the heart of ’90s nostalgia. In pop culture, it was an era of seeming possibility, where it became OK to talk about masturbation (in one of “Seinfeld’s” more famous episodes) or even anal sex (as on “Sex and the City”), where “Twin Peaks” and “The Sopranos” spoke to the rot at the core of American life. “The Nineties” paints a flattering, borderline obsequious portrait of Gen-X ’90s kids as too hip, savvy and highly educated to be suckered in by the gleam and obvious propaganda that seemed to define “The Eighties.” (The ’90s kid finds a generational motto in the tagline offered by Fox’s conspiratorial cult sci-fi show “The X-Files”: trust no one.)

What “The Nineties” misses — very deliberately, one imagines — is the guiding cynicism of such revolutions in television. Far from being powered by a kind of radical politics of inclusivity, TV was (and remains) guided by its ability to deliver certain demographics to advertisers. In the 1990s, these demographics splintered, becoming more specialized. Likewise, entertainment streams split. The bully “mean girls” watched “90210,” the bullied watched “My So-Called Life,” and the kids bullied by the bullied watched “Buffy the Vampire Slayer.” Then on Thursday night, everyone watched “Seinfeld.”

This parade of prime-time cultural revolutions betrayed the actual guiding political attitude of the decade: stasis. The second episode of “The Nineties” turns to the scandal-plagued political life of Bill Clinton. “A new season of American renewal has begun!” beams Clinton, thumb pressed characteristically over a loosely clenched fist, early in the episode. For the Democrats, Bill Clinton seemed like a new hope: charming, charismatic, hip, appearing in sunglasses on Arsenio to blow his saxophone. But like so many of TV’s mock-insurgencies, the Clinton presidency was a coup in terms of aesthetics, and little else.

Beyond the sundry accusations of impropriety against him (Whitewater, the Paula Jones and Monica Lewinsky sex scandals, etc.), Clinton supported the death penalty, “three strikes” sentencing, NAFTA, “don’t ask, don’t tell” and countless other policies that alienated him from his party’s left-progressive wing. Clinton embodied the emerging neoliberal ethic: cozying up to big banks and supporting laissez-faire economic policies that further destabilized the American working and middle classes, while largely avoiding the jingoist militarism, nationalism and family-values moralism of ’80s Reaganomics. Clinton’s American renewal was little more than a face-lift.

“The Simpsons,” naturally, nailed this devil-you-know distinction in a 1996 Halloween episode, which saw the bodies of Bill Clinton and then-presidential rival Bob Dole inhabited by slithering extraterrestrials. Indistinguishable in terms of tone and policy, the body snatching alien candidates beguiled the easily duped electorate with nonsensical stump speeches about moving “forward, not backward; upward, not forward; and always twirling, twirling, twirling towards freedom.”

A 1992 book by the American political scientist Francis Fukuyama summed up the ’90s’ neoliberal approach to politics. In “The End of History and the Last Man,” Fukuyama posited that the collapse of the Soviet Union following the Cold War had resolved any grand ideological and historical conflicts in world politics. Liberal democracy and capitalism had won the day. Free market democracy was humanity’s final form. History — or at least the concept of history as a process of sociological evolution and conflict between competing political systems — had run its course.

Following the publication of “The End of History,” Fukuyama became an institutional poli-sci Svengali (John Gray at the New Statesman dubbed him the “court philosopher of global capitalism”), his ideas holding significant sway in political circles. The 1990s in America, particularly during the Clinton presidency, were a self-styled realization of the “end of history.” In the wake of the Cold War and the collapse of the Berlin Wall, the president’s role was largely that of a functionary: enable the smooth functioning of markets and the free flow of capital. Such was the horizon of political thought.

Fukuyama’s book has been subjected to thorough criticism for its shortsightedness — not least of all for the way in which its central argument only serves to consolidate and naturalize the authority of the neoliberal elite. More concretely, 9/11 and its aftermath are often cited as signals of the “revenge of history,” which introduced new, complicated clashes of world-historical ideologies.

Though it’s often touted for its triumphalism, as a cheerleading handbook for the success of Westernized global capitalism, Fukuyama’s end-of-history theory is suffused with a certain melancholy. There’s one passage, often overlooked, which speaks to the general content and character of the ’90s (and “The Nineties”). “The end of history will be a very sad time,” he writes. “In the post-historical period there will be neither art nor philosophy, just the perpetual caretaking of the museum of human history. I can feel in myself, and see in others around me, a powerful nostalgia for the time when history existed. Such nostalgia, in fact, will continue to fuel competition and conflict even in the post-historical world for some time to come.”

Our fresh new millennium has been marked, in political terms, by cultural clashes between decadent Western liberalism and militant Islamism (both sides bolstering their positions with the hollow rhetoric of religious zealotry), the abject failure of both the Democratic and Republican parties, the reappearance of white supremacist and ethno-nationalist thinking, the thorough criticism of neoliberalism, and the rise of a new progressive-left (signaled by the popularity of Jeremy Corbyn and Bernie Sanders), alongside a similarly invigorated form of moderatism referred to as “the extreme centre.” Amid such wild vicissitudes, the placid neoliberal creep of Fukuyama’s “post-history” feels downright quaint.

This is the sort of modern nostalgia that CNN’s “The Nineties” taps into: a melancholy for the relative stability of a decade that was meant to mark the end of history itself. Not only did things seem even-keeled, but everything (a haircut, a GameBoy game about tiny Japanese cartoon monsters, a sitcom episode about waiting for a table) seemed radical, revolutionary and, somehow, deeply profound. We are, perhaps invariably, prone to feeling elegiac for even the hollowness of A Decade About Nothing. That is particularly because the 1990s abide in our politicians, our ideologies, our prime-time entertainments, our blockbuster movies and even, yes, our faddish toys, designed to ease our fidgety anxiety about the muddled present and keep us twirling, twirling back into the memory of a simpler, stupider past.

John Semley lives and works in Toronto. He is a books columnist at the Globe & Mail newspaper and the author of “This Is A Book About The Kids In The Hall” (ECW Press).

British Police on Guard Against Hordes of Acid Heads Seeking Historic LSD Stash

Is there really a hoard of ancient acid somewhere near a quaint Welsh village?

Photo Credit: Creative Commons/Wikimedia

Police are patrolling a quiet Welsh village amid concerns that criminals could try to unearth a huge stash of LSD said to have been secreted there four decades ago.

In 1977 police raided an old mansion in the village of Carno in west Wales and smashed a multimillion pound drugs operation said to have been supplying up to 90% of the LSD being used in the UK.

One of the detectives involved in the investigation, codenamed Operation Julie, is now claiming some of the stash may not have been found and may still be in pristine condition even after all these years.

Dyfed-Powys police are taking the claim seriously enough to warn villagers in Carno, near Llanidloes, that unwanted visitors may be on the way. They have told the owners of a house used as a drugs factory 40 years ago, Plas Llysyn, of the revelations and stepped up patrols in the area.

A police spokesperson said: “Dyfed-Powys police are aware of the issue and are assessing the content of the disclosure. We will be checking the records we hold to establish whether or not matters raised warrant further investigation.

“In the meantime we will be making the current owners of Plas Llysyn aware of the disclosure and the potential for persons to visit the area in an attempt to locate the drugs. We will be providing them with reassurance through increased patrols.

“We would also like to make it clear Dyfed-Powys police take a robust approach to drug trafficking and that appropriate action will be taken in respect of anyone suspected of using the information disclosed to assist them in obtaining and supplying controlled drugs.”

Operation Julie was launched after one of the men at the centre of the drugs racket was involved in a car crash in the town of Machynlleth, west Wales. Police pieced together a torn-up note and found it mentioned a key ingredient of LSD.

Some 800 officers, some of them disguised as hippies seeking the good life, descended on west Wales. There were elements of farce to the saga. Some of the undercover officers had fights with local police to maintain their cover.

One group of male officers was close to being rumbled after locals began to suspect them of being a gay cult and started to take a close interest in them. That led to female officers being introduced, including Sgt Julie Taylor, who was to be immortalised in the Clash song Julie’s Been Working for the Drug Squad.

After months of painstaking surveillance the police swooped and seized 1m tabs and enough raw materials to make a further 6.5m. A total of 120 arrests were made, resulting in 15 convictions and prison sentences totalling 120 years. The price of an LSD tab is said to have rocketed overnight from £1 to £5.

In a new edition of his book on the story, Undercover: Operation Julie – The Inside Story, the former officer Stephen Bentley said a statement from one of the gang members had claimed a substantial amount of LSD had been buried in a woodland near Plas Llysyn.

Bentley said he had only recently got hold of the statement – and believed it was true. “I have made my mind up. That stash is almost certainly still there,” he said.

Season Three of Better Call Saul: Objection! Relevance!

By Ed Hightower
1 July 2017

A show honestly exploring the ins and outs of the legal profession, particularly those of its younger members, could hardly fail to win an audience. Who are these people who, after four years of undergraduate study, take the Law School Admission Test (LSAT), apply to the best law schools they might expect admission to, then study the official rules of social and economic life for three more years, study for a comprehensive licensing exam (the bar exam or simply “the bar”), and then finally earn the right to represent clients in court and, with it, a chance to repay their massive student loans?

What motivates such students? Hope of riches? The fight to protect the innocent, or to compensate victims of official wrongdoing? What are their struggles? Do they have a family or personal life? How do they sharpen their skills, and how do they get along with their opponents, judges and employers?

Bob Odenkirk and Rhea Seehorn

Better Call Saul is the prequel to Breaking Bad, a hit drama about financially struggling high school chemistry teacher Walter White (Bryan Cranston), who turns his life upside down when he begins to manufacture the highly addictive drug methamphetamine. Walter White employs the crooked, flamboyant attorney Saul Goodman (Bob Odenkirk) to handle legal problems and otherwise assist in directing the operations of a drug cartel.

In Breaking Bad, Odenkirk masterfully portrays the life-like Saul Goodman—born Jimmy McGill, he changed his name to a Jewish one to entice prospective clients who prefer a lawyer who is, in his words, “a pipe-hitting member of the tribe [of Israel].” Throughout the show, Goodman spouts such gems as, “Remember, there’s no honor among thieves, except for us,” and “I once convinced a woman I was Kevin Costner.” His comic relief is welcome, and his thwarting of the law enforcement agencies encircling White earns sympathy, even for a lawyer bending the rules to assist a meth dealer.

To the credit of Better Call Saul’s creators, Seasons One and Two told a story about Jimmy McGill the young go-getter who takes correspondence courses in law while working in the mail room at his brother Charles’ enormous law firm, Hamlin Hamlin McGill. Certain scenes in these prior seasons rang true. One montage stands out, showing Jimmy going through the paces, defending indigent clients day in and day out for the lawyer’s equivalent of minimum wage, as he endears himself to the courthouse power brokers: the sheriffs’ deputies, prosecuting attorneys and, most importantly, the clerks.

Jimmy’s eagerness for success convincingly takes the viewer through a number of problems that almost all lawyers encounter, including clients with unrealistic expectations. One woman insists that her husband, who embezzled $1 million from the local government, should serve no jail time and get to keep the money. Another wants Jimmy to help him secede from the United States, offering to pay in currency he has printed himself. And an elderly woman wants to describe each and every figurine she has and ensure that her last will and testament spells out who will receive each one…and she only has $20 to pay him.

Even at its best, however, Saul was always at least half cop drama.

To that end, a miserable character who ultimately becomes Saul’s private investigator plays a role that earns at least as much screen time as the ostensible protagonist himself does. The ex-cop-turned-crook Mike Ehrmantraut faces hackneyed moral dilemmas as he tries to support his daughter-in-law and granddaughter. As background to the story, his son was killed by fellow police officers when he refused to conspire with them to deal in confiscated contraband.

In Season Three, the cameras virtually caress Ehrmantraut as he uses high-powered rifles, GPS tracking devices, and otherwise plays cat-and-mouse games in furtherance of various crimes. One scene of him tracking and being tracked by up-and-coming meth kingpin Gus Fring carries on for what seems like hours. This clever tough guy with a family to protect largely aims at the viewer’s simplest instincts.

Jonathan Banks as Mike Ehrmantraut

As the season wears on, the practice of law falls off the map almost entirely when Jimmy suffers a suspension of his law license. (The disciplinary process is very well done, both dramatically and cinematically.) Episodes that concern Jimmy’s venture into advertising are unwatchable. One gets the sense that the creators have run out of anything meaningful to say.

Worse still, they undermine what was left of Jimmy as a character supposedly torn between vice and virtue. In order to procure his fees in a lucrative civil case, Jimmy connives to destroy an elderly client’s quiet, peaceful life without a second thought. Prior relations with that client portrayed Jimmy as genuinely empathetic to this woman, and eager to get justice for her. To watch the backstabbing is to witness not character development, but rewriting. It is as lazy as it is unconvincing.

In this and many other instances, the creators of Better Call Saul flirt with the old prejudices against attorneys, espoused most consciously by the extreme right and other advocates of big business.

Reality, and history too, diverge from this gross oversimplification and irresponsible misrepresentation. A list of historical figures that knew their way around the courtroom would include Lincoln, Jefferson, Hamilton, Adams, Darrow, Kafka, Robespierre and Lenin. There is not a charlatan or self-seeker among them.

In contrast to Jimmy’s derangement, drug kingpin Gus Fring presents as a model employer at the money laundering front he operates. When a rival cartel terrorizes Fring’s employees, he makes a nauseating speech afterwards about not backing down, not in the United States, where good people have nothing to fear etc.—and he gives them overtime and paid counseling should they need it. Nice fellow!

One wants to ask the show’s creators: if the front business—a fried chicken restaurant—is so successful, why don’t the kingpins just sell fast food?

Greed, in short, seems to be their answer to everything, a supra-historical character flaw, almost an original sin in the case of Jimmy McGill.

Better Call Saul will appear on AMC for a fourth season. One imagines that little light will be shed on the practice of law, or the social relations it is rooted in, or much of anything else, saving most screen time for cops and robbers. The latter can be seen on virtually any other channel too, of course.

 

WSWS