Allison Williams as Peter Pan (Credit: NBC/Nino Munoz)
“Clap your hands if you believe!” a girl dressed unconvincingly as a boy implores the camera, breaking the fourth wall in a moment that is probably supposed to be genuinely touching. Tinkerbell, a tiny blob of CGI, is dying in a faux-mossy lantern. The girl-boy — Allison Williams, playing Peter Pan — is dressed in studied scraps of fishnet and green leather breeches, and she is directing her concern at a camera that we have spent the whole evening dutifully pretending doesn’t exist, as it follows her cavorting on wires hung from the ceiling.
NBC’s “Peter Pan Live!” promises to be my next holiday tradition (and when NBC says holiday, it means Christmas), so it looks like I’m going to have very depressing Decembers from here on out. But before I wrap myself up in blankets on my easy chair and resign myself to a lifetime as Ebenezer Scrooge, please allow me to complain a little.
I understand that this live musical thing is a huge success for NBC — “The Sound of Music” netted a cool 18 million viewers, an insane number for any programming that isn’t a sporting event in this day and age. For cast and crew, it’s a thrilling opportunity to make something that can’t be improved upon by a second take; for audiences, it’s a chance to come together for some appointment viewing (read: group snarking). Something could happen: that is the exciting, fun part of live television, and it’s what draws audiences to “Saturday Night Live,” the Super Bowl, and the Academy Awards, for different reasons.
But NBC’s production of this musical, adapted from the 1904 play, did nothing different, interesting or risky. (It wasn’t even really live! The singing was all pre-recorded tracks. What was the point, NBC?) Aside from a few changes to Tiger Lily’s song to make it slightly less racist, “Peter Pan Live!” was “Peter Pan,” more or less intact. Which is mind-boggling. If Disney had produced this, audiences would be asking: Why is it so overwhelmingly white? Why wasn’t Tiger Lily’s role rethought or cut entirely?
And the most obvious response to all of this, naturally, is that “Peter Pan” isn’t meant for television, because it’s a play, and it’s not meant for modern audiences, because it was written in 1904. But then that leads to the most obvious question that struck me as I was watching last night: Why on earth would anyone make this show in 2014? As the fabulous and opinionated Tom and Lorenzo wrote this morning: “There’s a difference between ‘old-fashioned entertainment’ and ‘offensive minstrel shows’ and this falls somewhere in the middle.”
We live in a world where a feminist retelling of the book of Genesis is a bestselling book and an upcoming miniseries on Lifetime. Where children’s fables are being unpacked and retold to include more female and minority perspectives. The “Hunger Games” franchise and the Marvel and DC universes are engaging with complex, dystopian themes in their storytelling. We are not shrinking violets in our American living rooms, and neither are our children, and yet this version of “Peter Pan” is like a time capsule from 1904, unwilling to do anything to disturb the fragile social norms of a bunch of long-dead white Brits.
NBC hewed to the standard set by Mary Martin in the ’50s and went with Peter Pan being played by an adult woman, and a classically feminine-looking one at that. The result was a lot of hilarious queer subtext — hilarious not because a bunch of women in a love quadrangle is inherently funny, but because it was so obvious that NBC had no intention of creating that subtext. The plot has something to do with all women wanting sex from Peter while he calls them “Mother”; also hilarious, but also depressingly unintentional.
“Peter Pan Live!” was, to be fair, executed adequately. It was competent, and not a lot more than that. Allison Williams didn’t make a mistake on live television, but didn’t exactly win the hearts of thousands, either. Christopher Walken seemed to understand how cheesy it was, and went for it with glee; he wasn’t exactly hamming it up, but he minced up a storm during the dance numbers. The camerawork couldn’t quite decide if it was going to follow the actors around or sit back and watch them, the way an audience would; the tension between the two made for some seasickness. And all of the delightful shortcuts of theater — which require your willing suspension of disbelief in exchange for the thrill of watching players act out a story live, right in front of you — looked terrible on-screen, particularly if you happened to be watching in HD. (The comb Peter gives Wendy was tacky spray-painted plastic, and they knew it, and we knew they knew it.)
“Peter Pan Live!” was mediocre, bland and depressingly safe, and because Wal-Mart spent millions of dollars tying in advertising throughout the whole thing we’re likely to see dozens more musicals brought boringly to life around the holidays. I’m not looking forward to it. But Scrooges, fortunately, are comforted by all the great tweets.
Among the many things that make the United States such an exceptional nation, its relative unwillingness to spend money on programs to better its citizens’ lives is especially notable. Ditto its utterly unrivaled enthusiasm for spending money on programs to make it easier to end the lives of other countries’ citizens. But while it’s true that Americans work more for less, it’s also true that no other country’s political class is quite so festooned with top-of-the-line killing machines, or so unencumbered when it comes to deploying those killing machines wherever and whenever they please.
Assuming you’re not a defense contractor lobbyist or lifetime bureaucratic warrior in the Pentagon, it doesn’t sound like too good a deal for the vast majority of America’s 300-million-plus population. But as President Obama showed during Sunday night’s new interview with Steve Kroft for “60 Minutes,” there’s a tried and true way that U.S. leadership manages to square that circle: By telling Americans that the globe is in many ways like a big university — one where the United States is the undisputed big man on campus.
“It looks like once again we are leading the operation,” Kroft complained to the president, noting that despite Obama’s efforts to build a broad coalition for his war against ISIS, the United States found itself still in the role of first among equals when it came to shouldering the campaign’s burden. It was a pointed question to deliver to a president who was ushered into office in part on a promise to wield America’s military more wisely, more judiciously and with more of a mind on the problems unresolved at home.
Still, President Obama, that one-time candidate of change, had a quick and direct answer: “Steve, that’s always the case. That’s always the case. America leads. We are the indispensable nation; we have capacity no one else has; our military is the best in the history of the world.
“When trouble comes up anywhere in the world,” Obama continued, “they don’t call Beijing, they don’t call Moscow — they call us.”
Having reduced geopolitics to the level of “Ghostbusters” (because when there’s sectarian killing born from a centuries-long ethnic and cultural conflict in your neighborhood, who ya gonna call?), Obama continued, “When there’s a typhoon in the Philippines, take a look at who’s helping the Philippines deal with that situation. When there’s an earthquake in Haiti, take a look at who’s leading the charge, making sure Haiti can rebuild.”
Obama then laid down the hammer, delivering the sound bite one imagines White House message mavens thought was terrifically badass when they came up with it during the waning hours of an all-night planning session some recent, godforsaken morning: “That’s how we roll,” the president of the United States said. “That’s what makes us America.” (“Bring ‘em on!” was already taken.)
So if in years ahead — perhaps during a time when the debate has shifted from whether to send troops back to Iraq to how many troops we should send; or perhaps during the next time when a temporary economic downturn persuades the most serious people in Washington that the welfare state is a luxury the United States cannot afford — you find yourself wondering why the debate in Washington is always between less welfare and more war now or less welfare and more war later, remember what Barack Obama told you.
Ken Burns’s superb documentary, The Roosevelts: An Intimate History, is in many ways a celebration of leadership, of the triumph of personal will over adversity, and of the belief in the age-old American story that each of us – no matter how burdened by life’s tragedies – has the capacity to accomplish great things.
The film also has much to say about the transformative nature of government: the idea, which all three Roosevelts shared, that it was the responsibility of government to serve as the primary guarantor of social and economic justice for all Americans – not just the privileged few at the top. It was this belief that formed the basis of Theodore Roosevelt’s New Nationalism and Franklin D. Roosevelt’s New Deal, and this belief that helped inspire Eleanor Roosevelt’s efforts to craft the Universal Declaration of Human Rights, adopted by the United Nations just three years after its 1945 founding.
What is often overlooked in this story is the role that all three of these remarkable leaders played in helping to preserve the American free enterprise system and in trying to mitigate the worst excesses of capitalism, not only out of a desire to protect the American people from exploitative labor practices or fraudulent financial dealings, but also out of a desire to protect our very way of life during an era when liberal capitalist democracy was under siege in much of the rest of the world. As the late Arthur Schlesinger Jr. once remarked, the twentieth century in many respects can be viewed as a struggle of ideologies, a time in which the anti-democratic forces of fascism and totalitarian communism were on the march, so that by January 1942, at the height of the Second World War, there were only a handful of democracies left on the planet.
In the rhetorically charged atmosphere of the mid-1930s, FDR’s critics alleged that the reforms he instigated under the New Deal were designed to take the country down the path to socialism. But nothing could be further from the truth. Social Security, unemployment insurance, and granting labor the right to organize were all inspired by the desire to provide the average American with a basic degree of economic security within the capitalist system. So too were the many financial reforms that brought us the likes of the Federal Deposit Insurance Corporation and the Securities and Exchange Commission. The same argument could be made about Theodore Roosevelt, whose decision to take on such conglomerates as the Beef Trust or the Northern Securities Company was driven by the desire not to destroy big business but to limit monopoly and restore the cut and thrust of the free market. In short, both men were motivated by the idea that the federal government had a responsibility to make capitalism work for the average American.
Eleanor Roosevelt concurred with these ideas, and in spite of her reputation as a left-leaning reformer, spent much of her considerable energy in the post-1945 world arguing in favor of the World War II monetary and trade reforms that helped launch the globalization of the world’s economy. In her May 21, 1945 “My Day” column, for example, ER spoke out in favor of the 1944 Bretton Woods accords which established the International Monetary Fund and International Bank for Reconstruction and Development, later the World Bank. Here, she argued in favor of the stabilization of currencies, because in the past there had been much speculative trading in this area, which resulted in “economic warfare” that in time brings us to “shooting warfare.” And she had this to say about the establishment of the International Bank for Reconstruction and Development:
Some foolish people will ask: Why do we have to concern ourselves with the development and reconstruction of the ruined countries? The answer is simple. We are the greatest producing country in the world. We need markets not only at home, but abroad, and we cannot have them unless people can start up their industries and national economy again and buy from us. If Europe or Asia falls apart because of starvation or lack of work for their people, chaos will result and World War III will be in the making. In that event, we know that we will have to be a part of it.
Hence, ER insisted that we needed “both the bank and the fund for our own security, as well as for that of the rest of the world.” She then urged her readers to write to their Senators and Congressmen in support of the treaty, for as she so eloquently put it:
Whether you are a farmer or a merchant, whether your business is big or little, you are personally affected by it. Even if you don’t sell directly to a foreign country, you are indirectly affected – for the prosperity of the [foreign] country means your prosperity, and we cannot prosper without trade with our neighbors in the world of tomorrow.
As is so often the case, when we look back we see that the challenges of the past are not that different from the challenges we face today. Once again we face a world where the free-market system is in desperate need of reform; a world where income inequality has reached levels not seen since the Gilded Age; a world where the specter of long-term unemployment and limited opportunity has dimmed the hopes of an entire generation; a world where poverty and a lack of opportunity have given rise to anti-democratic extremists who threaten the very lives and well-being of millions. Yet sadly, and unlike in the heady days of the first six decades of the twentieth century, our leaders in Washington seem unable or unwilling to shape a response to these many challenges befitting the legacy of such great political figures as Theodore, Franklin, and Eleanor Roosevelt.
A great deal of this can be attributed to the irresponsible behavior of many members of Congress, particularly among the members of the extreme right, whose obstructionist policies and rigid anti-government ideology have played a significant part in rendering the 113th Congress one of the least effective and least respected in American history.
But we should also never forget – as Ken Burns and his scriptwriter Geoffrey Ward have reminded us through this outstanding film – that we too must share part of the blame. For as much as we may admire the leadership of the Roosevelts, none of their accomplishments would have been possible without the support of the American people. Leadership, after all, is a dynamic process that requires the cooperation of both public figures and the public, and if we are living in an age that seems incapable of producing transformative government, we need to recognize that in a democracy it is the people who bear the final responsibility for their fate.
Franklin Roosevelt perhaps put it best when he urged the American people to recognize that “government is ourselves and not an alien power over us. The ultimate rulers of our democracy are not a President and Senators and Congressmen and Government officials but the voters of this country.”
5:00 am, August 28, 2014
Janice Bowling, a 67-year-old grandmother and Republican state senator from rural Tennessee, thought it only made sense that the city of Tullahoma be able to offer its local high-speed Internet service to areas beyond the city limits.
After all, many of her rural constituents had slow service or did not have access to commercial providers, like AT&T Inc. and Charter Communications Inc.
But a 1999 Tennessee law prohibits cities that operate their own Internet networks from providing access outside the boundaries where they provide electrical service. Bowling wanted to change that and introduced a bill in February to allow them to expand.
She viewed the network, which according to advertised services offers speeds about 80 times faster than AT&T’s and 10 times faster than Charter’s in Tullahoma, as a utility, like electricity, that all Tennesseans need.
“We don’t quarrel with the fact that AT&T has shareholders that it has to answer to,” Bowling said with a drawl while sitting in the spacious wood-paneled den of her log-cabin-style home. “That’s fine, and I believe in capitalism and the free market. But when they won’t come in, then Tennesseans have an obligation to do it themselves.”
At a meeting three weeks after Bowling introduced Senate Bill 2562, the state’s three largest telecommunications companies — AT&T, Charter, and Comcast Corp. — tried to convince Republican leaders to relegate the measure to so-called “summer study,” a black hole that effectively kills a bill. Bowling, described as “feisty” by her constituents, initially beat back the effort and thought she’d get a vote.
That’s when Joelle Phillips, president of AT&T’s Tennessee operations, leaned toward her across the table in a conference room next to the House caucus leader’s office and said tersely, “Well, I’d hate for this to end up in litigation,” Bowling recalls.
The threat surprised Bowling, and apparently AT&T’s ominous warning reached her colleagues as well. Days later, support in the Tennessee House for Bowling’s bill dissolved. AT&T had won.
“I had no idea the force that would come against this, because it’s just so reasonable and so necessary,” Bowling said.
AT&T and Phillips didn’t respond to emails asking for comment.
A national fight
Tullahoma is just one battlefront in a nationwide war that the telecommunications giants are fighting against the spread of municipal broadband networks. For more than a decade, AT&T, Comcast, Time Warner Cable Inc., and CenturyLink Inc. have spent millions of dollars to lobby state legislatures, influence state elections and buy research to try to stop the spread of public Internet services that often offer faster speeds at cheaper rates.
The companies have succeeded in getting laws passed in 20 states that ban or restrict municipalities from offering Internet to residents.
Now the fight has gone national. The Federal Communications Commission in Washington, D.C., is considering requests from Chattanooga, Tennessee, and Wilson, North Carolina, to pre-empt state laws that block municipalities from building or expanding broadband networks, restrictions that, the cities argue, hinder economic growth.
If the FCC rules in favor of the cities, and the ruling survives any legal challenges, municipalities nationwide will be free to offer high-speed Internet to residents when they aren’t satisfied with the service provided by private telecommunications companies.
To better understand the municipal broadband debate, the Center for Public Integrity traveled to two southern cities: Tullahoma, which has a broadband network, and Fayetteville, North Carolina, which doesn’t.
City-provided broadband widespread
More than 130 cities from Norwood, Massachusetts, to Clallam County, Washington, currently offer fiber or cable Internet connections to their communities, according to the Institute for Local Self-Reliance, a group that supports municipal broadband. The municipalities are mostly small to mid-sized cities that critics say large Internet providers avoid because the return on investment is too low.
Cities build broadband networks to support businesses, improve health care and education, and attract jobs, they say. About 89 cities offer gigabit speeds, a rate that can download a 4.5 gigabyte movie in 36 seconds. The same file takes an hour at 10 megabits per second. Slower DSL or dial-up connections, which are common in rural areas, would take many hours longer.
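The article’s download-time comparison is easy to verify. The following is a quick sketch, not anything from the reporting itself; the helper function name is my own, the 4.5-gigabyte file and the speeds are the ones cited above, and it assumes decimal units (1 gigabyte = 8,000 megabits) with no protocol overhead.

```python
def download_time_seconds(file_gigabytes: float, speed_megabits_per_s: float) -> float:
    """Raw transfer time for a file, ignoring protocol overhead.

    Assumes decimal units: 1 gigabyte = 8,000 megabits.
    """
    file_megabits = file_gigabytes * 8000
    return file_megabits / speed_megabits_per_s

movie_gb = 4.5  # the 4.5-gigabyte movie cited above

# Gigabit connection (1,000 megabits per second):
print(download_time_seconds(movie_gb, 1000))  # 36.0 seconds

# 10 megabits per second:
print(download_time_seconds(movie_gb, 10) / 3600)  # 1.0 hour
```

The numbers match the article: 36 seconds at gigabit speed, a full hour at 10 Mbps, and proportionally longer still on slower DSL or dial-up links.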
Remember when “The Simpsons” was good — about a week ago?
The ongoing “Simpsons” marathon on FXX began with a bang, running through the unsteady early couple of seasons quickly, then getting to hit after hit after hit. It was like living through the 1990s at warp speed. With its earthy frankness about family life, the challenges of growing up, and the socioeconomic squeeze the American family was undergoing, it was hard to believe that this “Simpsons” had ever been made at all, let alone that it’s technically the same show that’s wrapping up its marathon this weekend and still airs on Fox every Sunday night. If FXX had really wanted to build anticipation and interest for this marathon, they should have run the episodes in reverse order; it would have been an inspiring story of a show that was mediocre for 15 years or so, then suddenly got great.
The complaints about the contemporary “Simpsons” have all been made for years. Primarily, the show, around 2000 or so, abandoned its central conceit slowly, then all at once. The Simpson family’s problems went from trying to stretch the family budget to having too many international vacations to go on or too many other shows to parody in calorie-free jokefests that had nothing to do with anything. The side characters who added color and interest to the show began to take over, with every member of the immediate family having run through every conceivable plot many times over. When episodes weren’t devoted to trips to Brazil or the backstory of Duffman, the Simpsons themselves were warped almost beyond recognition on a whim (Homer became rageful; Marge’s flitting through new fixations like bodybuilding came to look like some sort of personality disorder).
“The Simpsons” went awry for what seem like fairly obvious reasons; it ran out of stories to tell and yet had to keep going. But unlike other shows that had creative lulls and recovered late in their runs — “30 Rock” and “Frasier” come to mind — it’s hard to imagine it recovering. The trend on TV has moved so far away from what, specifically, “The Simpsons” did well, when it did things well. To wit: the five-time Emmy winner for best comedy, “Modern Family,” has placed its family in a world where any material thing is easily obtained. All its adults have lucrative jobs, or don’t need to work at all. As for other consistently successful family series on the dial — well, where are they, really? The only other show as big as “The Simpsons” is “Family Guy,” whose nihilistic attitude and giddy unconcern with material reality “The Simpsons” has aped more and more, leading up to a crossover episode this fall. What is going to push “The Simpsons” to return to depicting the Simpsons as real people when the most successful live-action comedies don’t bother?
It certainly won’t be a push from the audience. Speaking anecdotally, it’d appear that the adult audience has largely left “The Simpsons” behind; its new episodes fail to make the sort of impact among what used to be its core audience that far-lower-rated shows like “Girls” or “Louie” do. And speaking statistically, the show hit its all-time low among adult viewers last season. The teens tuning into Fox’s comedy block, on the other hand, have no expectation that “The Simpsons” be anything more than a mélange of pop culture references tied together by an over-the-top, nearly villainous dad.
It’s not the fault of “The Simpsons” that it seems unlikely to change — the world is far too hostile to its best elements now, because everything else on TV learned the wrong lessons from the show’s success. The Simpsons’ lives were fanciful, and so now every show like “The Simpsons” is entirely outlandish. There was a note of sarcasm, and so, as though to outdo it, scorched-earth bitterness became the order of the day. Lately, showrunners have been signaling that the show will come to an end soon, even disclosing that an episode that had already aired was planned as a series finale. Time is running out. But, really, all the better — it’s time for “Simpsons” fans to accept that at a time when socioeconomic satire is needed more than ever, the entertainment industry is ill-equipped to provide it.
And it’s time for something new on Fox. Perhaps the prime 8 p.m. Sunday night time slot can go to “Bob’s Burgers,” a much-lower-wattage series that has built a fanbase around its very real depictions of a marriage and a family in unglamorous surroundings. The family’s somewhat challenging economic situation — they all work together at a restaurant perpetually one emergency away from closing — isn’t the point of the show, and the emotions in play are far less overt than the old “Simpsons” brand of open, at times gooey, sentiment. It may not be world-beatingly popular. But the expectations that come with mega-popularity, TV fans learned around 2000, can end up destroying a series.
Daniel D’Addario is a staff reporter for Salon’s entertainment section. Follow him on Twitter @DPD_
Director, USC Annenberg Innovation Lab. Producer, “Mean Streets”, “The Last Waltz”, “Until the End Of the World”, “To Die For”
So we are about to embark on a sixteen-week exploration of innovation, entertainment, and the arts. This course is going to be about all three, but I’m going to start with the “art” part — because without the art, no amount of technological innovation or entertainment marketing savvy is going to get you to go to the movie theater. However, I think there’s also a deeper, more controversial claim to be made along these same lines: Without the art, none of the innovation matters — and indeed, it may be impossible — because the art is what gives us vision, and what grounds us to the human element in all of this.

Although there will be lectures, during which I’ll do my best to share what I’ve learned about the way innovation, entertainment, and the arts fit together, the most crucial part of the class is the dialogue between us, and specifically the insights coming from you as you teach me about your culture and your ideals.

The bottom line is that the world has come a long way, but from my perspective, we’re also living in uniquely worrisome times; my generation had dreams of how to make a better life that have remained woefully unfulfilled (leaving many of us cynical and disillusioned), but at the same time your generation has been saddled with the wreckage of our attempts and is now facing what may seem to be insurmountable odds. I’m writing this letter in the hopes that it will help set the stage for a truly cross-generational dialogue over the next sixteen weeks, in which I help you understand the contexts and choices that have brought us where we are today, and in which you help me, and one another, figure out the best way to move forward from here.
When I was your age, I had my heart broken and my idealism challenged multiple times by the assassinations of my political heroes: namely, John and Bobby Kennedy and Martin Luther King. Many in my generation turned away from politics and found our solace in works of art and entertainment. So one of the things I want to teach you about is the period from 1965 to 1980 when the artists really ruled both the music and the film industries. Some said “the lunatics had taken over the asylum” (and, amusingly enough, David Geffen named his record company Asylum), but if you look at the quality of work that was produced, it was extraordinary; in fact, most of it is still watched and listened to today. Moreover, in that period the most artistic work also sold the best: The Beatles’ Sgt. Pepper was without doubt the best record of the year but also the best selling, and The Godfather was similarly both the best movie of the year and the biggest box office hit. That’s not happening right now, and I want to try to understand why that is. I want to explore, with you, what the implications of this shift might be, and whether this represents a problem. It may be that the fifteen years your parents and I were lucky enough to experience were one of those renaissance moments that only come along once a century, so perhaps it’s asking too much to expect that I’ll see it occur again in my lifetime. Nevertheless, I do hope it happens at least once in yours.
I spoke of the heartbreak of political murder that has permanently marked me and my peers, but we have also been profoundly disappointed by politics’ failure to improve the lives of the average citizen. In 1969, the median salary for a male worker was $35,567 (in 2012 dollars). Today, it is $33,904. So for 44 years, while wages for the top 10% have continued to climb, most Americans have been caught in a “Great Stagnation,” bringing into question the whole purpose of the American capitalist economy (and, along the way, shattering our faith in the “American Dream”). The Reagan-era notion that what benefited the 1% — “the establishment” — would benefit everyone has by now been thoroughly discredited, yet it seems that we are still struggling to pick up the pieces after this failed experiment.
Seen through this lens, the savage partisanship of the current moment makes an odd kind of sense. What were the establishment priorities that moved inexorably forward in both Republican and Democratic administrations? The first was a robust and aggressive foreign policy. As Stephen Kinzer wrote about those in power during the 1950s, “Exceptionalism — the view that the United States has a right to impose its will because it knows more, sees farther, and lives on a higher moral plane than other nations — was to them not a platitude, but the organizing principle of daily life and global politics.”
From Eisenhower to Obama, this principle has been the guiding light of our foreign policy, bringing with it annual defense expenditures that dwarf those of all the world’s major powers combined. The second principle of the establishment was that “what is good for Wall Street is good for America.” Despite Democrats’ efforts to paint the GOP as the party of Wall Street, one would only have to look at the track record of Clinton’s treasury secretaries Rubin and Summers (specifically, their zealous efforts to kill the Glass-Steagall Act and deregulate the big banks and the commodities markets) to see that both major parties are guilty of sucking up to money; apparently, the establishment rules no matter who is in power. Was it any surprise, then, that Obama appointed the architects of bank deregulation, Summers and Geithner, to clean up the mess their policies had caused? Was it any surprise that they failed? Was it any surprise that establishment ideas about the surveillance state were not challenged by Obama? The good news is that, as a nation, we have grown tired of being the world’s unpaid cop, and we are tired of dancing to Wall Street’s tune. Slowly, we are learning that these policies may benefit the 1%, but they don’t benefit the people as a whole. My guess is the 2016 election may be fought on this ground, and we may finally begin to see real change, but the fact remains that we — both your generation and mine — are right now deeply mired in the fallout of unfulfilled promises and the failures of the political system.
So this is the source of boomer disillusionment. But even if we are cynical about political change, we can try to imagine together a future where great artistic work continues to flourish; this, then, is the Innovation and Entertainment part of the course. It’s not that I want you to give up on politics — in fact the events of the last few weeks in Ferguson only reinforce my belief that when people disdain politics, their anger gets channeled into violence. But what I do want you to think about is that art and culture are more plastic — they can be molded and changed more easily than politics. There is a sense in which art, politics, and economics are all inextricably and symbiotically tied together, but history has proven to us that art serves as a powerful corrective against the dangers of the establishment. There is a system of checks and balances in which, even though the arts may rely on the social structures afforded by strong economic and political systems, artists can also inspire a culture to move forward, to reject the evils of greed and prejudice, and to reconnect to its human roots. If we are seeking political and economic change, then, an authentic embrace of the arts may be key. Part of your role as communication scholars is to look more closely at the communication surrounding us and think critically about the effects it’s having, whose agenda is being promoted, and whether that’s the agenda that will serve us best. One of the tasks we’ll wrestle with in this class will be how we can get the digital fire hose of social media to really support artists, not just brands.
In 2011, the screenwriter Charlie Kaufman (Being John Malkovich, Adaptation) gave a lecture at the British Film Institute. He said something both simple and profound:
People all over the world spend countless hours of their lives every week being fed entertainment in the form of movies, TV shows, newspapers, YouTube videos and the Internet. And it’s ludicrous to believe that this stuff doesn’t alter our brains.
It’s also equally ludicrous to believe that — at the very least — this mass distraction and manipulation is not convenient for the people who are in charge. People are starving. They may not know it because they’re being fed mass-produced garbage. The packaging is colorful and loud, but it’s produced in the same factories that make Pop Tarts and iPads, by people sitting around thinking, “What can we do to get people to buy more of these?”
And they’re very good at their jobs. But that’s what it is you’re getting, because that’s what they’re making. They’re selling you something. And the world is built on this now. Politics and government are built on this, corporations are built on this. Interpersonal relationships are built on this. And we’re starving, all of us, and we’re killing each other, and we’re hating each other, and we’re calling each other liars and evil because it’s all become marketing and we want to win because we’re lonely and empty and scared and we’re led to believe winning will change all that. But there is no winning.
I think Charlie is right. People are starving, so we give them bread and circuses.
But I think Charlie is wrong when he says “there is no winning.” In fact, I think we are really in a “winner-take-all” society. Look at the digital pop charts: 80% of the music streams are for 1% of the content. That means that Jay-Z and Beyoncé are billionaires, but the average musician can barely make a living. Bob Dylan’s first album only sold 4,000 copies. In this day and age, he would have been dropped by his label before he created his greatest work.
A writer I greatly admired, Gabriel García Márquez, died recently. For me, Márquez embodied the role of the artist in society, marked by the refusal to believe that we are incapable of creating a more just world. Utopias are out of favor now. Yet Márquez never gave up believing in the transformational power of words to conjure magic and seize the imagination. The other crucial aspect of Márquez’s work is that he teaches us the importance of regionalism. In a commercial culture of sameness where you can stroll through a mall in Shanghai and forget that you’re not in Los Angeles, Márquez’s work was distinctly Latin American. His work was as unique as the songs of Gilberto Gil, or the cinema of Alejandro González Iñárritu. In a culture like ours that has so long advocated a “melting pot” philosophy that papers over our differences, it is valuable to recognize that there is a difference between allowing our differences to serve as barriers and appreciating the things that make each culture unique, situated in time and space and connected to its people. What’s more, young artists also need to have the sense of history that Márquez celebrated when he said, “I cannot imagine how anyone could even think of writing a novel without having at least a vague idea of the 10,000 years of literature that have gone before.” Cultural amnesia only leads to cultural death.
With these values in mind, my hope is to lead you in a discussion of politics and culture in the context of 250 years of America’s somewhat utopian battle to build “a city on a hill.” I think many in my generation had this utopian impulse (which is, it should be observed, different from idealism), but it is slipping away like a short-term memory. I did not aspire to be that professor who quotes Dr. King, but I feel I must. He said the night before he was assassinated, “I may not get there with you, but I believe in the promised land.” My generation knew that the road towards a better society would be long, but we hoped our children’s children might live in that land, even if we weren’t able to get there with you. It may take even longer than we imagined, but I know your generation believes in justice and equality, and that fills me with hope that the dream of some sort of promised land is not wholly lost. The next step, then, is to figure out how to work together, to learn from the past while living in the present moment in order to secure a better future, and I believe this class offers us an incredible opportunity to do precisely that.
So what are the skills that we can develop together in order to open a real cross-generational dialogue? First, I would hope we would learn to improvise. I want you to challenge me, just as I encourage and challenge you. Improvisation means sometimes throwing away your notes and just responding from your gut to the ideas being presented. It takes both courage and intelligence, but I’m pretty sure you have deep stores of both qualities, which will help you show leadership both in class and throughout the rest of your life. Leadership is more than just bravery and intellect, however; it also requires vulnerability and compassion, skills that I hope we can similarly cultivate together. I want you to know that I don’t have all the answers — and, more importantly, I know that I don’t have all the answers. I am somewhat confused by our current culture and I am looking to you for insight. You need to have that same vulnerability with your peers, and you also need to treat them with compassion as you struggle together to understand this new world of disruption. I know these four elements — courage, intelligence, vulnerability, and compassion — may seem like they are working at cross-purposes, but we will need all four qualities if we are to take on the two tasks before us. One of our tasks is to try to restore a sense of excellence in our culture — the belief that great art and entertainment can also be popular. The second task is for baby boomer parents and their millennial children to form a natural political alliance going forward. As I’ve said, I don’t think the notion that we will get to “the promised land” is totally dead, and with your energy and the tools of the new media ecosystem to help us organize, we can keep working towards a newly hopeful society, culture, and economy, in spite of the mess we have left you with.
This is, at least, the plan. Of course, as the great critic James Agee once said, “Performance, in which the whole fate and terror rests, is another matter.”
I’m enough of a distraction addict that a low-level ambient guilt about not getting my real work done hovers around me for most of the day. And this distractible quality in me pervades every part of my life. The distractions—What am I making for dinner?, Who was that woman in “Fargo”?, or, quite commonly, What else should I be reading?—are invariably things that can wait. What, I wonder, would I be capable of doing if I weren’t constantly worrying about what I ought to be doing?
And who is this frumpy thirty-something man who has tried to read “War and Peace” five times, never making it past the garden gate? I took the tome down from the shelf this morning and frowned again at those sad little dog-ears near the fifty-page mark.
Are the luxuries of time on which deep reading is reliant available to us anymore? Even the attention we deign to give to our distractions, those frissons, is narrowing.
It’s important to note this slippage. As a child, I would read for hours in bed without the possibility of a single digital interruption. Even the phone (which was anchored by wires to the kitchen wall downstairs) was generally mute after dinner. Our two hours of permitted television would come to an end, and I would seek out the solitary refuge of a novel. And deep reading (as opposed to reading a Tumblr feed) was a true refuge. What I liked best about that absorbing act was the fact that books became a world unto themselves, one that I (an otherwise powerless kid) had some control over. There was a childish pleasure in holding the mysterious object in my hands; in preparing for the story’s finale by monitoring what Austen called a “tell-tale compression of the pages”; in proceeding through some perfect sequence of plot points that bested by far the awkward happenstance of real life.
The physical book, held, knowable, became a small mental apartment I could have dominion over, something that was alive because of my attention and then lived in me.
But now . . . that thankful retreat, where my child-self could become so lost, seems unavailable to me. Today there is no room in my house, no block in my city, where I am unreachable.
Eventually, if we start giving them a chance, moments of absence reappear, and we can pick them up if we like. One appeared this morning, when my partner flew to Paris. He’ll be gone for two weeks. I’ll miss him, but this is also my big break.
I’ve taken “War and Peace” back down off the shelf. It’s sitting beside my computer as I write these lines—accusatory as some attention-starved pet.
You and me, old friend. You, me, and two weeks. I open the book, I shut the book, and I open the book again. The ink swirls up at me. This is hard. Why is this so hard?
* * *
Dr. Douglas Gentile, a friendly professor at Iowa State University, recently commiserated with me about my pathetic attention span. “It’s me, too, of course,” he said. “When I try to write a paper, I can’t keep from checking my e-mail every five minutes. Even though I know it’s actually making me less productive.” This failing is especially worrying for Gentile because he happens to be one of the world’s leading authorities on the effects of media on the brains of the young. “I know, I know! I know all the research on multitasking. I can tell you absolutely that everyone who thinks they’re good at multitasking is wrong. We know that in fact it’s those who think they’re good at multitasking who are the least productive when they multitask.”
The brain itself is not, whatever we may like to believe, a multitasking device. And that is where our problem begins. Your brain does a certain amount of parallel processing in order to synthesize auditory and visual information into a single understanding of the world around you, but the brain’s attention is itself only a spotlight, capable of shining on one thing at a time. So the very word multitask is a misnomer. There is rapid-shifting minitasking, there is lame-spasms-of-effort-tasking, but there is, alas, no such thing as multitasking. “When we think we’re multitasking,” says Gentile, “we’re actually multiswitching.”
We can hardly blame ourselves for being enraptured by the promise of multitasking, though. Computers—like televisions before them—tap into a very basic brain function called an “orienting response.” Orienting responses served us well in the wilderness of our species’ early years. When the light changes in your peripheral vision, you must look at it because that could be the shadow of something that’s about to eat you. If a twig snaps behind you, ditto. Having evolved in an environment rife with danger and uncertainty, we are hardwired to always default to fast-paced shifts in focus. Orienting responses are the brain’s ever-armed alarm system and cannot be ignored.
Gentile believes it’s time for a renaissance in our understanding of mental health. To begin with, just as we can’t accept our body’s cravings for chocolate cake at face value, neither can we any longer afford to indulge the automatic desires our brains harbor for distraction.
* * *
It’s not merely difficult at first. It’s torture. I slump into the book, reread sentences, entire paragraphs. I get through two pages and then stop to check my e-mail—and down the rabbit hole I go. After all, one does not read “War and Peace” so much as suffer through it. It doesn’t help that the world at large, being so divorced from such pursuits, is often aggressive toward those who drop away into single-subject attention wells. People don’t like it when you read “War and Peace.” It’s too long, too boring, not worth the effort. And you’re elitist for trying.
In order to finish the thing in the two weeks I have allotted myself, I must read one hundred pages each day without fail. If something distracts me from my day’s reading—a friend in the hospital, a magazine assignment, sunshine—I must read two hundred pages on the following day. I’ve read at this pace before, in my university days, but that was years ago and I’ve been steadily down-training my brain ever since.
* * *
Another week has passed—my “War and Peace” struggle continues. I’ve realized now that the subject of my distraction is far more likely to be something I need to look at than something I need to do. There have always been activities—dishes, gardening, sex, shopping—that derail whatever purpose we’ve assigned to ourselves on a given day. What’s different now is the addition of so much content that we passively consume.
Only this morning I watched a boy break down crying on “X Factor,” then regain his courage and belt out a half-decent rendition of Beyoncé’s “Listen”; next I looked up the original Beyoncé video and played it twice while reading the first few paragraphs of a story about the humanity of child soldiers; then I switched to a Nina Simone playlist prepared for me by Songza, which played while I flipped through a slide show of American soldiers seeing their dogs for the first time in years; and so on, ad nauseam. Until I shook myself out of this funk and tried to remember what I’d sat down to work on in the first place.
* * *
If I’m to break from our culture of distraction, I’m going to need practical advice, not just depressing statistics. To that end, I switch gears and decide to stop talking to scientists for a while; I need to talk to someone who deals with attention and productivity in the so-called real world. Someone with a big smile and tailored suits, like organizational guru Peter Bregman. He runs a global consulting firm that gets CEOs to unleash the potential of their workers, and he’s also the author of the acclaimed business book 18 Minutes, which counsels readers to take a minute out of every work hour (plus five minutes at the start and end of the day) to do nothing but set an intention.
Bregman told me he sets his watch to beep every hour as a reminder that it’s time to right his course again. Aside from the intention setting, Bregman counsels no more than three e-mail check-ins a day. This notion of batch processing was anathema to someone like me, used to checking my in-box so constantly, particularly when my work feels stuck. “It’s incredibly inefficient to switch back and forth,” said Bregman, echoing every scientist I’d spoken to on multitasking. “Besides, e-mail is, actually, just about the least efficient mode of conversation you can have. And what we know about multitasking is that, frankly, you can’t. You just derail.”
“I just always feel I’m missing something important,” I said. “And that’s precisely why we lose hours every day, that fear.” Bregman argues that it’s people who can get ahead of that fear who end up excelling in the business world that he spends his own days in. “I think everyone is more distractible today than we used to be. It’s a very hard thing to fix. And as people become more distracted, we know they’re actually doing less, getting less done. Your efforts just leak out. And those who aren’t—aren’t leaking—are going to be the most successful.”
I hate that I leak. But there’s a religious certainty required in order to devote yourself to one thing while cutting off the rest of the world. We don’t know that the inbox is emergency-free, we don’t know that the work we’re doing is the work we ought to be doing. But we can’t move forward in a sane way without having some faith in the moment we’ve committed to. “You need to decide that things don’t matter as much as you might think they matter,” Bregman suggested as I told him about my flitting ways. And that made me think there might be a connection between the responsibility-free days of my youth and that earlier self’s ability to concentrate. My young self had nowhere else to be, no permanent anxiety nagging at his conscience. Could I return to that sense of ease? Could I simply be where I was and not seek out a shifting plurality to fill up my time?
* * *
It happened softly and without my really noticing.
As I wore a deeper groove into the cushions of my sofa, so the book I was holding wore a groove into my (equally soft) mind. Moments of total absence began to take hold more often; I remembered what it was like to be lost entirely in a well-spun narrative. There was the scene where Anna Mikhailovna begs so pitifully for a little money, hoping to send her son to war properly dressed. And there were, increasingly, more like it. More moments where the world around me dropped away and I was properly absorbed. A “causeless springtime feeling of joy” overtakes Prince Andrei; a tearful Pierre sees in a comet his last shimmering hope; Emperor Napoleon takes his troops into the heart of Russia, oblivious to the coming winter that will destroy them all…
It takes a week or so for withdrawal symptoms to work through a heroin addict’s body. While I wouldn’t pretend to compare severity here, doubtless we need patience, too, when we deprive ourselves of the manic digital distractions we’ve grown addicted to.
That’s how it was with my Tolstoy and me. The periods without distraction grew longer, I settled into the sofa and couldn’t hear the phone, couldn’t hear the ghost-buzz of something else to do. I’m teaching myself to slip away from the world again.
* * *
Yesterday I fell asleep on the sofa with a few dozen pages of “War and Peace” to go. I could hear my cell phone buzzing from its perch on top of the piano. I saw the glowing green eye of my Cyclops modem as it broadcast potential distraction all around. But on I went past the turgid military campaigns and past the fretting of Russian princesses, until sleep finally claimed me and my head, exhausted, dreamed of nothing at all. This morning I finished the thing at last. The clean edges of its thirteen hundred pages have been ruffled down into a paper cabbage, the cover is pilled from the time I dropped it in the bath. Holding the thing aloft, trophy style, I notice the book is slightly larger than it was before I read it.
It’s only after the book is laid down, and I’ve quietly showered and shaved, that I realize I haven’t checked my e-mail today. The thought of that duty comes down on me like an anvil.
Instead, I lie back on the sofa and think some more about my favorite reader, Milton, and his own anxieties around reading. By the mid-1650s, he had suffered that larger removal from the crowds: he had lost his vision entirely and could not read at all—at least not with his own eyes. From within this new solitude, he worried that he could no longer meet his potential. One sonnet, written shortly after the loss of his vision, begins:
When I consider how my light is spent,
Ere half my days, in this dark world and wide,
And that one Talent which is death to hide
Lodged with me useless . . .
Yet from that position, in the greatest of caves, he began producing his greatest work. The epic “Paradise Lost,” a totemic feat of concentration, was dictated to aides, including his three daughters.
Milton already knew, after all, the great value in removing himself from the rush of the world, so perhaps those anxieties around his blindness never had a hope of dominating his mind. I, on the other hand, and all my peers, must make a constant study of concentration itself. I slot my ragged “War and Peace” back on the shelf. It left its marks on me the same way I left my marks on it (I feel as awake as a man dragged across barnacles on the bottom of some ocean). I think: This is where I was most alive, most happy. How did I go from loving that absence to being tortured by it? How can I learn to love that absence again?
This essay is adapted from “The End of Absence” by Michael Harris, published by Current / Penguin Random House.