In a Labor Day weekend mood, I watched “The Grapes of Wrath” again this evening. Labor Day is, after all, a celebration of the American labor movement, dedicated to the social and economic achievements of workers. “The Grapes of Wrath” portrays familiar themes in the American worker experience: from displaced Oklahoma farmers to baristas and Twitter people with degrees, there is a continual struggle between workers and those with wealth desiring cheap, easily manipulated labor.
The wealthy pretty much got their way in the States until the Depression (rich people gambling to get richer) fueled the rebalancing of the worker/owner relationship – more in favor of the worker – under FDR and his New Deal. This balance, which was great for the overall health of the country, continued through LBJ and the Great Society. Now things are going the other way, with the wealthy neoliberal controller classes producing a political and economic system that assures their success no matter which of the two political parties wins. Reagan, Clinton and now Obama dismantled the Great Society, fought to break the worker unions, and deregulated banking and other entities once deemed “public trusts.” The resultant series of economic crises and bursting bubbles destroyed the working and middle classes and threatens to remove what’s left of the social safety nets.
Tom Joad’s famous final speech (excerpts below) to his Ma in the movie “The Grapes of Wrath” powerfully expresses the thoughts and yearnings of the Depression-era worker and resonates with the increasingly disenfranchised workers of today. The American revolutionary, Tom Joad, espousing collective action that creates change, is a familiar subplot in the American drama. What distresses me about this speech is Tom’s dream of spreading wealth more justly “…if all our folks got together and yelled…”. In this 21st century, people yell for a few months (Occupy) and the owners’ illusion and control return. In the age of the “meh generation” and Ayn Rand, the notion of a collective soul is anathema.
Tom Joad: I been thinking about us, too, about our people living like pigs and good rich land layin’ fallow. Or maybe one guy with a million acres and a hundred thousand farmers starvin’. And I been wonderin’ if all our folks got together and yelled…
Ma Joad: Tommy, you’re not aimin’ to kill nobody.
Tom Joad: No, Ma, not that. That ain’t it. It’s just, well as long as I’m an outlaw anyways… maybe I can do somethin’… maybe I can just find out somethin’, just scrounge around and maybe find out what it is that’s wrong and see if they ain’t somethin’ that can be done about it. I ain’t thought it out all clear, Ma. I can’t. I don’t know enough.
Ma Joad: How am I gonna know about ya, Tommy? Why they could kill ya and I’d never know. They could hurt ya. How am I gonna know?
Tom Joad: Well, maybe it’s like Casy says. A fellow ain’t got a soul of his own, just little piece of a big soul, the one big soul that belongs to everybody, then…
Ma Joad: Then what, Tom?
Tom Joad: Then it don’t matter. I’ll be all around in the dark – I’ll be everywhere. Wherever you can look – wherever there’s a fight, so hungry people can eat, I’ll be there. Wherever there’s a cop beatin’ up a guy, I’ll be there. I’ll be in the way guys yell when they’re mad. I’ll be in the way kids laugh when they’re hungry and they know supper’s ready, and when the people are eatin’ the stuff they raise and livin’ in the houses they build – I’ll be there, too.
Observers going back to Aristotle have noted that nervous dyspepsia and intellectual accomplishment often go hand in hand. Sigmund Freud’s trip to the United States in 1909, which introduced psychoanalysis to this country, was marred (as he would later frequently complain) by his nervous stomach and bouts of diarrhea. Many of the letters between William and Henry James, first-class neurotics both, consist mainly of the exchange of various remedies for their stomach trouble.
But for debilitating nervous stomach complaints, nothing compares to that which afflicted poor Charles Darwin, who spent decades of his life prostrated by his upset stomach.
That affliction of afflictions, Stossel argues, was Darwin’s overpowering anxiety – something that might explain why his influential studies of human emotion were of such intense interest to him. Stossel points to a “Diary of Health” that the scientist kept for six years between the ages of 40 and 46 at the urging of his physician. He filled dozens of pages with complaints like “chronic fatigue, severe stomach pain and flatulence, frequent vomiting, dizziness (‘swimming head,’ as Darwin described it), trembling, insomnia, rashes, eczema, boils, heart palpitations and pain, and melancholy.”
In 1865 – six years after the completion of The Origin of Species – a distraught 56-year-old Darwin wrote a letter to another physician, John Chapman, outlining the multitude of symptoms that had bedeviled him for decades:
For 25 years extreme spasmodic daily & nightly flatulence: occasional vomiting, on two occasions prolonged during months. Vomiting preceded by shivering, hysterical crying[,] dying sensations or half-faint. & copious very palid urine. Now vomiting & every passage of flatulence preceded by ringing of ears, treading on air & vision …. Nervousness when E leaves me.
“E” refers to his wife Emma, who loved Darwin dearly and who mothered his ten children – a context in which his “nervousness” does suggest anxiety’s characteristic tendency to wring worries out of unlikely scenarios, not to mention a textbook case of separation anxiety.
Darwin was frustrated that dozens of physicians, beginning with his own father, had failed to cure him. By the time he wrote to Dr. Chapman, Darwin had spent most of the past three decades – during which time he’d struggled heroically to write On the Origin of Species – housebound by general invalidism. Based on his diaries and letters, it’s fair to say he spent a full third of his daytime hours since the age of twenty-eight either vomiting or lying in bed.
Chapman had treated many prominent Victorian intellectuals who were “knocked up” with anxiety at one time or another; he specialized in, as he put it, those high-strung neurotics “whose minds are highly cultivated and developed, and often complicated, modified, and dominated by subtle psychical conflicts, whose intensity and bearing on the physical malady it is difficult to comprehend.” He prescribed the application of ice to the spinal cord for almost all diseases of nervous origin.
Chapman came out to Darwin’s country estate in late May 1865, and Darwin spent several hours each day over the next several months encased in ice; he composed crucial sections of The Variation of Animals and Plants Under Domestication with ice bags packed around his spine.
The treatment didn’t work. The “incessant vomiting” continued. So while Darwin and his family enjoyed Chapman’s company (“We liked Dr. Chapman so very much we were quite sorry the ice failed for his sake as well as ours,” Darwin’s wife wrote), by July they had abandoned the treatment and sent the doctor back to London.
Chapman was not the first doctor to fail to cure Darwin, and he would not be the last. To read Darwin’s diaries and correspondence is to marvel at the more or less constant debilitation he endured after he returned from the famous voyage of the Beagle in 1836. The medical debate about what, exactly, was wrong with Darwin has raged for 150 years. The list proposed during his life and after his death is long: amoebic infection, appendicitis, duodenal ulcer, peptic ulcer, migraines, chronic cholecystitis, “smouldering hepatitis,” malaria, catarrhal dyspepsia, arsenic poisoning, porphyria, narcolepsy, “diabetogenic hyper-insulism,” gout, “suppressed gout,” chronic brucellosis (endemic to Argentina, which the Beagle had visited), Chagas’ disease (possibly contracted from a bug bite in Argentina), allergic reactions to the pigeons he worked with, complications from the protracted seasickness he experienced on the Beagle, and “refractive anomaly of the eyes.” I’ve just read an article, “Darwin’s Illness Revealed,” published in a British academic journal in 2005, that attributes Darwin’s ailments to lactose intolerance.
Various competing hypotheses attempted to diagnose Darwin, both during his lifetime and after. But Stossel argues that “a careful reading of Darwin’s life suggests that the precipitating factor in every one of his most acute attacks of illness was anxiety.” His greatest rebuttal to other medical theories is a seemingly simple yet profound piece of evidence:
When Darwin would stop working and go walking or riding in the Scottish Highlands or North Wales, his health would be restored.
(Of course, one need not suffer from debilitating anxiety in order to reap the physical and mental benefits of walking, arguably one of the simplest yet most rewarding forms of psychic restoration and a powerful catalyst for creativity.)
Director, USC Annenberg Innovation Lab. Producer, “Mean Streets”, “The Last Waltz”, “Until the End of the World”, “To Die For”
So we are about to embark on a sixteen-week exploration of innovation, entertainment, and the arts. This course is going to be about all three, but I’m going to start with the “art” part — because without the art, no amount of technological innovation or entertainment marketing savvy is going to get you to go to the movie theater. However, I think there’s also a deeper, more controversial claim to be made along these same lines: Without the art, none of the innovation matters — and indeed, it may be impossible — because the art is what gives us vision, and what grounds us in the human element in all of this. Although there will be lectures, during which I’ll do my best to share what I’ve learned about the way innovation, entertainment, and the arts fit together, the most crucial part of the class is the dialogue between us, and specifically the insights coming from you as you teach me about your culture and your ideals. The bottom line is that the world has come a long way, but from my perspective, we’re also living in uniquely worrisome times; my generation had dreams of how to make a better life that have remained woefully unfulfilled (leaving many of us cynical and disillusioned), but at the same time your generation has been saddled with the wreckage of our attempts and is now facing what may seem to be insurmountable odds. I’m writing this letter in the hopes that it will help set the stage for a truly cross-generational dialogue over the next sixteen weeks, in which I help you understand the contexts and choices that have brought us where we are today, and in which you help me, and one another, figure out the best way to move forward from here.
When I was your age, I had my heart broken and my idealism challenged multiple times by the assassinations of my political heroes: namely, John and Bobby Kennedy and Martin Luther King. Many in my generation turned away from politics and found our solace in works of art and entertainment. So one of the things I want to teach you about is a time from 1965–1980 when the artists really ruled both the music and the film industries. Some said “the lunatics had taken over the asylum” (and, amusingly enough, David Geffen named his record company Asylum), but if you look at the quality of work that was produced, it was extraordinary; in fact, most of it is still watched and listened to today. Moreover, in that period the most artistic work also sold the best: The Beatles’ Sgt. Pepper was without doubt the best record of the year but also the best selling, and The Godfather was similarly both best movie of the year and the biggest box office hit. That’s not happening right now, and I want to try to understand why that is. I want to explore, with you, what the implications of this shift might be, and whether this represents a problem. It may be that the fifteen years your parents and I were lucky enough to experience were one of those renaissance moments that only come along once every century, so perhaps it’s asking too much to expect that I’ll see it occur again in my lifetime. Nevertheless, I do hope it happens at least once in yours.
I spoke of the heartbreak of political murder that has permanently marked me and my peers, but we have also been profoundly disappointed by politics’ failure to improve the lives of the average citizen. In 1969, the median salary for a male worker was $35,567 (in 2012 dollars). Today, it is $33,904. So for 44 years, while wages for the top 10% have continued to climb, most Americans have been caught in a “Great Stagnation,” bringing into question the whole purpose of the American capitalist economy (and, along the way, shattering our faith in the “American Dream”). The Reagan-era notion that what benefited the 1% — “the establishment” — would benefit everyone has by now been thoroughly discredited, yet it seems that we are still struggling to pick up the pieces after this failed experiment.
Seen through this lens, the savage partisanship of the current moment makes an odd kind of sense. What were the establishment priorities that moved inexorably forward in both Republican and Democratic administrations? The first was a robust and aggressive foreign policy. As Stephen Kinzer wrote about those in power during the 1950s, “Exceptionalism — the view that the United States has a right to impose its will because it knows more, sees farther, and lives on a higher moral plane than other nations — was to them not a platitude, but the organizing principle of daily life and global politics.”
From Eisenhower to Obama, this principle has been the guiding light of our foreign policy, bringing with it annual defense expenditures that dwarf those of all the world’s major powers combined. The second principle of the establishment was that “what is good for Wall Street is good for America.” Despite Democrats’ efforts to paint the GOP as the party of Wall Street, one need only look at the track record of Clinton’s treasury secretaries Rubin and Summers (specifically, their zealous efforts to kill the Glass-Steagall Act and deregulate the big banks and the commodities markets) to see that both major parties are guilty of sucking up to money; apparently, the establishment rules no matter who is in power. Was it any surprise, then, that Obama appointed the architects of bank deregulation, Summers and Geithner, to clean up the mess their policies had caused? Was it any surprise that they failed? Was it any surprise that establishment ideas about the surveillance state were not challenged by Obama? The good news is that, as a nation, we have grown tired of being the world’s unpaid cop, and we are tired of dancing to Wall Street’s tune. Slowly, we are learning that these policies may benefit the 1%, but they don’t benefit the people as a whole. My guess is the 2016 election may be fought on this ground, and we may finally begin to see real change, but the fact remains that we — both your generation and mine — are right now deeply mired in the fallout of unfulfilled promises and the failures of the political system.
So this is the source of boomer disillusionment. But even if we are cynical about political change, we can try to imagine together a future where great artistic work continues to flourish; this, then, is the Innovation and Entertainment part of the course. It’s not that I want you to give up on politics — in fact the events of the last few weeks in Ferguson only reinforce my belief that when people disdain politics, their anger gets channeled into violence. But what I do want you to think about is that art and culture are more plastic — they can be molded and changed more easily than politics. There is a sense in which art, politics, and economics are all inextricably and symbiotically tied together, but history has proven to us that art serves as a powerful corrective against the dangers of the establishment. There is a system of checks and balances in which, even though the arts may rely on the social structures afforded by strong economic and political systems, artists can also inspire a culture to move forward, to reject the evils of greed and prejudice, and to reconnect to its human roots. If we are seeking political and economic change, then, an authentic embrace of the arts may be key. Part of your role as communication scholars is to look more closely at the communication surrounding us and think critically about the effects it’s having, whose agenda is being promoted, and whether that’s the agenda that will serve us best. One of the tasks we’ll wrestle with in this class will be how we can get the digital fire hose of social media to really support artists, not just brands.
In 2011, the screenwriter Charlie Kaufman (Being John Malkovich, Adaptation) gave a lecture at the British Film Institute. He said something both simple and profound:
People all over the world spend countless hours of their lives every week being fed entertainment in the form of movies, TV shows, newspapers, YouTube videos and the Internet. And it’s ludicrous to believe that this stuff doesn’t alter our brains.
It’s also equally ludicrous to believe that — at the very least — this mass distraction and manipulation is not convenient for the people who are in charge. People are starving. They may not know it because they’re being fed mass-produced garbage. The packaging is colorful and loud, but it’s produced in the same factories that make Pop Tarts and iPads, by people sitting around thinking, “What can we do to get people to buy more of these?”
And they’re very good at their jobs. But that’s what it is you’re getting, because that’s what they’re making. They’re selling you something. And the world is built on this now. Politics and government are built on this, corporations are built on this. Interpersonal relationships are built on this. And we’re starving, all of us, and we’re killing each other, and we’re hating each other, and we’re calling each other liars and evil because it’s all become marketing and we want to win because we’re lonely and empty and scared and we’re led to believe winning will change all that. But there is no winning.
I think Charlie is right. People are starving, so we give them bread and circuses.
But I think Charlie is wrong when he says “there is no winning”. In fact I think we are really in a “winner-take-all” society. Look at the digital pop charts. 80% of the music streams are for 1% of the content. That means that Jay-Z and Beyoncé are billionaires, but the average musician can barely make a living. Bob Dylan’s first album only sold 4,000 copies. In this day and age, he would have been dropped by his label before he created his greatest work.
A writer I greatly admired, Gabriel García Márquez, died recently. For me, Márquez embodied the role of the artist in society, marked by the refusal to believe that we are incapable of creating a more just world. Utopias are out of favor now. Yet Márquez never gave up believing in the transformational power of words to conjure magic and seize the imagination. The other crucial aspect of Márquez’s work is that he teaches us the importance of regionalism. In a commercial culture of sameness where you can stroll through a mall in Shanghai and forget that you’re not in Los Angeles, Márquez’s work was distinctly Latin American. His work was as unique as the songs of Gilberto Gil, or the cinema of Alejandro González Iñárritu. In a culture like ours that has so long advocated a “melting pot” philosophy that papers over our differences, it is valuable to recognize that there is a difference between allowing our differences to serve as barriers and appreciating the things that make each culture unique, situated in time and space and connected to its people. What’s more, young artists also need to have the sense of history that Márquez celebrated when he said, “I cannot imagine how anyone could even think of writing a novel without having at least a vague idea of the 10,000 years of literature that have gone before.” Cultural amnesia only leads to cultural death.
With these values in mind, my hope is to lead you in a discussion of politics and culture in the context of 250 years of America’s somewhat utopian battle to build “a city on a hill.” I think many in my generation had this utopian impulse (which is, it should be observed, different than idealism), but it is slipping away like a short-term memory. I did not aspire to be that professor who quotes Dr. King, but I feel I must. He said the night before he was assassinated, “I may not get there with you, but I believe in the promised land.” My generation knew that the road towards a better society would be long, but we hoped our children’s children might live in that land, even if we weren’t able to get there with you. It may take even longer than we imagined, but I know your generation believes in justice and equality, and that fills me with hope that the dream of some sort of promised land is not wholly lost. The next step, then, is to figure out how to work together, to learn from the past while living in the present moment in order to secure a better future, and I believe this class offers us an incredible opportunity to do precisely that.
So what are the skills that we can develop together in order to open a real cross-generational dialogue? First, I would hope we would learn to improvise. I want you to challenge me, just as I encourage and challenge you. Improvisation means sometimes throwing away your notes and just responding from your gut to the ideas being presented. It takes both courage and intelligence, but I’m pretty sure you have deep stores of both qualities, which will help you show leadership both in class and throughout the rest of your life. Leadership is more than just bravery and intellect, however; it also requires vulnerability and compassion, skills that I hope we can similarly cultivate together. I want you to know that I don’t have all the answers — and, more importantly, I know that I don’t have all the answers. I am somewhat confused by our current culture and I am looking to you for insight. You need to have that same vulnerability with your peers, and you also need to treat them with compassion as you struggle together to understand this new world of disruption. I know these four elements — courage, intelligence, vulnerability, and compassion — may seem like they are working at cross-purposes, but we will need all four qualities if we are to take on the two tasks before us. One of our tasks is to try to restore a sense of excellence in our culture — the belief that great art and entertainment can also be popular. The second task is for baby boomer parents and their millennial children to form a natural political alliance going forward. As I’ve said, I don’t think the notion that we will get to “the promised land” is totally dead, and with your energy and the tools of the new media ecosystem to help us organize, we can keep working towards a newly hopeful society, culture, and economy, in spite of the mess we have left you with.
This is, at least, the plan. Of course, as the great critic James Agee once said, “Performance, in which the whole fate and terror rests, is another matter.”
Here’s a tiny question. Are we idearupt? As in: bankrupt of great ideas?
Go ahead. Name me an “ism” that still works.
Conservatism? #LOL. Liberalism? #lol. Capitalism, or what’s left of it? Sure, maybe for billionaires. “Libertarianism”? I invite you to Mogadishu, good sir. Socialism…syndicalism…anarchism…mercantilism…revanchism…shit!!
Wait. What about…Bronyism?
Perhaps you see my point.
We’re living through a kind of implosion. Not just of institutions—that much is obvious. But a collapse of institutions detonated by something deeper: an implosion of ideas.
Yesterday’s ideas about how to organize societies and economies simply don’t work anymore.
And so we’re left in a vacuum. What’s a vacuum? A void. An emptiness. An absence. We’re out of good ideas about how societies, democracies, and economies should be organized and managed.
But not just “how”. More deeply, by whom—and why.
What’s the point, you often wonder. Of your life. Of the sheer goddamned futility of it all.
Working harder on stuff that doesn’t matter to buy junk you can’t afford to impress people you don’t like obeying the orders of robots programmed by assholes who’ve never read a book in their lives that oversee the entire economy purely for the production of “profit” not real things that actually benefit human lives which are getting poorer so they’re just one paycheck away from disaster…and even if you do somehow win the infernal contest of all the above, what’s the jackpot at the end of the rainbow? A life that’s totally meaningless in the first place.
What the fuck?
If you think all that’s…futile…you’re not wrong. You’re precisely right. It is. Yesterday’s great “isms” do not offer enough, to enough, for enough, from enough.
Whether it is “liberalism” or “conservatism”, the result is the same.
The middle class implodes; the rich grow incalculably richer; the poor are trampled. What’s the result? To pay for social services, the assets of the state are “privatized”; but that can only go on for so long. Eventually, ninety percent plus of people see their incomes stagnate; their wealth vanish; economies stall as people grow poorer. Society can no longer afford public goods, as tax bases dry up; public and private debts grow; and currencies are devalued. People’s lives go from prosperous and stable to precarious and impoverished in a generation or two.
See the pattern? The collapse of great ideas about how to organize stuff isn’t merely…an idea. It’s reality.
Consider the twentieth century. The world created international law, international development, international trade, and international human rights. These were tremendous, astonishing human accomplishments. The kind that mankind might never have even dreamed of a few short centuries ago.
And now? What do we consider “great ideas”? Cruising to your less-than-minimum-wage temp gig at a robo-warehouse in your self-driving car share checking how many “friends” Spot made on the latest doggy dating app hoping you got another heart on yours?
Those aren’t great ideas. They’re clever businesses, and for that we should applaud them. But we must recognize. You can’t Tinder your way to a better world. You can’t even Tinder your way to a life worth living.
All the great “isms” are winking out. And so. The world is starting to burn. Nations are fracturing. Social contracts are being torn apart. In most of the world’s richest nations, not one but two generations will be lost. The global economy is stagnating.
And already from that witches’ cauldron is rising the smoke. Of violence, animosity, extremism, hatred. Which will eventually, if the fire is left untended, kindle into a wildfire of war.
All this is not inevitable. Yet. But it is predictable. For a single, simple reason.
We no longer have ideas powerful enough to organize the world. Yesterday’s “isms” are vanishing. And in their place is left a vacuum.
Here’s the catch.
You probably believe that something always fills a vacuum. For you’ve been trained to be an obedient believer in progress; in advancement; in growth; in efficiency; in spontaneous order; in self-organization; in automaticity; in manifest destiny; and in the inevitability of it all.
In other words, you’re a True Believer in…the Big Idea: the idea of the progress of ideas.
Something always fills a vacuum, right? A bigger, better idea?
Sometimes, nothing does. For a very long while.
Sometimes, there is no progress of ideas.
Sometimes the darkness stays. And lasts. And deepens. Into an endless, frozen midnight. An abyss of collapsing ideas from which mankind must escape.
We call those times Dark Ages. And my worry is that we’re stumbling headlong into one.
“The Gods have died!”, the villagers cried.
“What should we do? Mourn? Grieve? Plead?”, the Chief asked the Priest, desperate.
“No”, said the Priest, looking at the sky, afraid. “We must pray!”, he shouted, angrily.
“Pray?”, the villagers muttered to themselves, confused. “To whom?”
“To the Gods”, the Priest whispered.
“But the Gods are dead”, the Chief protested.
“Who do you think killed them?”, the Priest demanded.
“Gods who were more powerful still. And it is to them we must pray”.
“New Gods! But who are they?”, the villagers asked one another, astonished, anxious, afraid.
“They will reveal themselves. But only if our prayers prove worthy. Come. Let us pray!”, the Priest commanded.
“We are saved!”, cried the villagers.
“Glory!”, cried the Chief.
The Priest smiled.
He raised his hands to the heavens; and they all bowed beneath the perfect sky the new Gods hid behind.
The sun rose high. There was not a cloud to be seen.
BLOGGER COMMENT: The “big ideas” of today are silly multimillion dollar phone apps and dumping ice cubes on your head in front of the new Audi.
I recall having many conversations like this in the Sixties. I’m happy there are those beginning to question the status quo in the 21st century. Perhaps one might begin by looking back to Aristotle, Plato, Socrates…
I’m enough of a distraction addict that a low-level ambient guilt about not getting my real work done hovers around me for most of the day. And this distractible quality in me pervades every part of my life. The distractions—What am I making for dinner?, Who was that woman in “Fargo”?, or, quite commonly, What else should I be reading?—are invariably things that can wait. What, I wonder, would I be capable of doing if I weren’t constantly worrying about what I ought to be doing?
And who is this frumpy thirty-something man who has tried to read “War and Peace” five times, never making it past the garden gate? I took the tome down from the shelf this morning and frowned again at those sad little dog-ears near the fifty-page mark.
Are the luxuries of time on which deep reading is reliant available to us anymore? Even the attention we deign to give to our distractions, those frissons, is narrowing.
It’s important to note this slippage. As a child, I would read for hours in bed without the possibility of a single digital interruption. Even the phone (which was anchored by wires to the kitchen wall downstairs) was generally mute after dinner. Our two hours of permitted television would come to an end, and I would seek out the solitary refuge of a novel. And deep reading (as opposed to reading a Tumblr feed) was a true refuge. What I liked best about that absorbing act was the fact that each book became a world unto itself, one that I (an otherwise powerless kid) had some control over. There was a childish pleasure in holding the mysterious object in my hands; in preparing for the story’s finale by monitoring what Austen called a “tell-tale compression of the pages”; in proceeding through some perfect sequence of plot points that bested by far the awkward happenstance of real life.
The physical book, held, knowable, became a small mental apartment I could have dominion over, something that was alive because of my attention and then lived in me.
But now . . . that thankful retreat, where my child-self could become so lost, seems unavailable to me. Today there is no room in my house, no block in my city, where I am unreachable.
Eventually, if we start giving them a chance, moments of absence reappear, and we can pick them up if we like. One appeared this morning, when my partner flew to Paris. He’ll be gone for two weeks. I’ll miss him, but this is also my big break.
I’ve taken “War and Peace” back down off the shelf. It’s sitting beside my computer as I write these lines—accusatory as some attention-starved pet.
You and me, old friend. You, me, and two weeks. I open the book, I shut the book, and I open the book again. The ink swirls up at me. This is hard. Why is this so hard?
* * *
Dr. Douglas Gentile, a friendly professor at Iowa State University, recently commiserated with me about my pathetic attention span. “It’s me, too, of course,” he said. “When I try to write a paper, I can’t keep from checking my e-mail every five minutes. Even though I know it’s actually making me less productive.” This failing is especially worrying for Gentile because he happens to be one of the world’s leading authorities on the effects of media on the brains of the young. “I know, I know! I know all the research on multitasking. I can tell you absolutely that everyone who thinks they’re good at multitasking is wrong. We know that in fact it’s those who think they’re good at multitasking who are the least productive when they multitask.”
The brain itself is not, whatever we may like to believe, a multitasking device. And that is where our problem begins. Your brain does a certain amount of parallel processing in order to synthesize auditory and visual information into a single understanding of the world around you, but the brain’s attention is itself only a spotlight, capable of shining on one thing at a time. So the very word multitask is a misnomer. There is rapid-shifting minitasking, there is lame-spasms-of-effort-tasking, but there is, alas, no such thing as multitasking. “When we think we’re multitasking,” says Gentile, “we’re actually multiswitching.”
We can hardly blame ourselves for being enraptured by the promise of multitasking, though. Computers—like televisions before them—tap into a very basic brain function called an “orienting response.” Orienting responses served us well in the wilderness of our species’ early years. When the light changes in your peripheral vision, you must look at it because that could be the shadow of something that’s about to eat you. If a twig snaps behind you, ditto. Having evolved in an environment rife with danger and uncertainty, we are hardwired to always default to fast-paced shifts in focus. Orienting responses are the brain’s ever-armed alarm system and cannot be ignored.
Gentile believes it’s time for a renaissance in our understanding of mental health. To begin with, just as we can’t accept our body’s cravings for chocolate cake at face value, neither can we any longer afford to indulge the automatic desires our brains harbor for distraction.
* * *
It’s not merely difficult at first. It’s torture. I slump into the book, reread sentences, entire paragraphs. I get through two pages and then stop to check my e-mail—and down the rabbit hole I go. After all, one does not read “War and Peace” so much as suffer through it. It doesn’t help that the world at large, being so divorced from such pursuits, is often aggressive toward those who drop away into single-subject attention wells. People don’t like it when you read “War and Peace.” It’s too long, too boring, not worth the effort. And you’re elitist for trying.
In order to finish the thing in the two weeks I have allotted myself, I must read one hundred pages each day without fail. If something distracts me from my day’s reading—a friend in the hospital, a magazine assignment, sunshine—I must read two hundred pages on the following day. I’ve read at this pace before, in my university days, but that was years ago and I’ve been steadily down-training my brain ever since.
* * *
Another week has passed—my “War and Peace” struggle continues. I’ve realized now that the subject of my distraction is far more likely to be something I need to look at than something I need to do. There have always been activities—dishes, gardening, sex, shopping—that derail whatever purpose we’ve assigned to ourselves on a given day. What’s different now is the addition of so much content that we passively consume.
Only this morning I watched a boy break down crying on “X Factor,” then regain his courage and belt out a half-decent rendition of Beyoncé’s “Listen”; next I looked up the original Beyoncé video and played it twice while reading the first few paragraphs of a story about the humanity of child soldiers; then I switched to a Nina Simone playlist prepared for me by Songza, which played while I flipped through a slide show of American soldiers seeing their dogs for the first time in years; and so on, ad nauseam. Until I shook myself out of this funk and tried to remember what I’d sat down to work on in the first place.
* * *
If I’m to break from our culture of distraction, I’m going to need practical advice, not just depressing statistics. To that end, I switch gears and decide to stop talking to scientists for a while; I need to talk to someone who deals with attention and productivity in the so-called real world. Someone with a big smile and tailored suits, such as organizational guru Peter Bregman. He runs a global consulting firm that gets CEOs to unleash the potential of their workers, and he’s also the author of the acclaimed business book 18 Minutes, which counsels readers to take a minute out of every work hour (plus five minutes at the start and end of the day) to do nothing but set an intention.
Bregman told me he sets his watch to beep every hour as a reminder that it’s time to right his course again. Aside from the intention setting, Bregman counsels no more than three e-mail check-ins a day. This notion of batch processing was anathema to someone like me, used to checking my in-box so constantly, particularly when my work feels stuck. “It’s incredibly inefficient to switch back and forth,” said Bregman, echoing every scientist I’d spoken to on multitasking. “Besides, e-mail is, actually, just about the least efficient mode of conversation you can have. And what we know about multitasking is that, frankly, you can’t. You just derail.”
“I just always feel I’m missing something important,” I said. “And that’s precisely why we lose hours every day, that fear.” Bregman argues that it’s people who can get ahead of that fear who end up excelling in the business world that he spends his own days in. “I think everyone is more distractible today than we used to be. It’s a very hard thing to fix. And as people become more distracted, we know they’re actually doing less, getting less done. Your efforts just leak out. And those who aren’t—aren’t leaking—are going to be the most successful.”
I hate that I leak. But there’s a religious certainty required in order to devote yourself to one thing while cutting off the rest of the world. We don’t know that the inbox is emergency-free, we don’t know that the work we’re doing is the work we ought to be doing. But we can’t move forward in a sane way without having some faith in the moment we’ve committed to. “You need to decide that things don’t matter as much as you might think they matter,” Bregman suggested as I told him about my flitting ways. And that made me think there might be a connection between the responsibility-free days of my youth and that earlier self’s ability to concentrate. My young self had nowhere else to be, no permanent anxiety nagging at his conscience. Could I return to that sense of ease? Could I simply be where I was and not seek out a shifting plurality to fill up my time?
* * *
It happened softly and without my really noticing.
As I wore a deeper groove into the cushions of my sofa, so the book I was holding wore a groove into my (equally soft) mind. Moments of total absence began to take hold more often; I remembered what it was like to be lost entirely in a well-spun narrative. There was the scene where Anna Mikhailovna begs so pitifully for a little money, hoping to send her son to war properly dressed. And there were, increasingly, more like it. More moments where the world around me dropped away and I was properly absorbed. A “causeless springtime feeling of joy” overtakes Prince Andrei; a tearful Pierre sees in a comet his last shimmering hope; Emperor Napoleon takes his troops into the heart of Russia, oblivious to the coming winter that will destroy them all…
It takes a week or so for withdrawal symptoms to work through a heroin addict’s body. While I wouldn’t pretend to compare severity here, doubtless we need patience, too, when we deprive ourselves of the manic digital distractions we’ve grown addicted to.
That’s how it was with my Tolstoy and me. The periods without distraction grew longer, I settled into the sofa and couldn’t hear the phone, couldn’t hear the ghost-buzz of something else to do. I’m teaching myself to slip away from the world again.
* * *
Yesterday I fell asleep on the sofa with a few dozen pages of “War and Peace” to go. I could hear my cell phone buzzing from its perch on top of the piano. I saw the glowing green eye of my Cyclops modem as it broadcast potential distraction all around. But on I went past the turgid military campaigns and past the fretting of Russian princesses, until sleep finally claimed me and my head, exhausted, dreamed of nothing at all. This morning I finished the thing at last. The clean edges of its thirteen hundred pages have been ruffled down into a paper cabbage, the cover is pilled from the time I dropped it in the bath. Holding the thing aloft, trophy style, I notice the book is slightly larger than it was before I read it.
It’s only after the book is laid down, and I’ve quietly showered and shaved, that I realize I haven’t checked my e-mail today. The thought of that duty comes down on me like an anvil.
Instead, I lie back on the sofa and think some more about my favorite reader Milton – about his own anxieties around reading. By the mid-1650s, he had suffered that larger removal from the crowds, he had lost his vision entirely and could not read at all—at least not with his own eyes. From within this new solitude, he worried that he could no longer meet his potential. One sonnet, written shortly after the loss of his vision, begins:
When I consider how my light is spent,
Ere half my days, in this dark world and wide,
And that one Talent which is death to hide
Lodged with me useless . . .
Yet from that position, in the greatest of caves, he began producing his greatest work. The epic “Paradise Lost,” a totemic feat of concentration, was dictated to aides, including his three daughters.
Milton already knew, after all, the great value in removing himself from the rush of the world, so perhaps those anxieties around his blindness never had a hope of dominating his mind. I, on the other hand, and all my peers, must make a constant study of concentration itself. I slot my ragged “War and Peace” back on the shelf. It left its marks on me the same way I left my marks on it (I feel awake as a man dragged across barnacles on the bottom of some ocean). I think: This is where I was most alive, most happy. How did I go from loving that absence to being tortured by it? How can I learn to love that absence again?
This essay is adapted from “The End of Absence” by Michael Harris, published by Current / Penguin Random House.
“Art and physics, like wave and particle, are an integrated duality … two different but complementary facets of a single description of the world.”
“It’s part of the nature of man,” Ray Bradbury told Carl Sagan and Arthur C. Clarke as they peered into the future of space exploration, “to start with romance and build to a reality.” “What would happen,” Marshall McLuhan wondered in his seminal 1964 treatise Understanding Media: The Extensions of Man, “if art were suddenly seen for what it is, namely, exact information of how to rearrange one’s psyche in order to anticipate the next blow from our own extended faculties?” More than a quarter century later, Leonard Shlain picked up the inquiry with added dimension in Art & Physics: Parallel Visions in Space, Time, and Light (public library) — an exploration of how “the inscrutability of modern art and the impenetrability of the new physics” intersect in a shared system of thinking about how the world works. In the preface, Shlain — neither an artist nor a physicist himself — considers how his training as a surgeon lends him a unique perspective on the two fields and their cross-pollination:
A surgeon is both an artist and a scientist… Surgeons rely heavily on their intuitive visual-spatial right-hemispheric mode. At the same time, our training is obviously scientific. Left-brained logic, reason, and abstract thinking are the stepping-stones leading to the vast scientific literature’s arcane tenets. The need in my profession to shuttle back and forth constantly between these two complementary functions of the human psyche has served me well for this project.
Shlain lays out the basic premise of the parallel between the two fields:
Art and physics are a strange coupling. Of the many human disciplines, could there be two that seem more divergent? The artist employs image and metaphor; the physicist uses number and equation. Art encompasses an imaginative realm of aesthetic qualities; physics exists in a world of crisply circumscribed mathematical relationships between quantifiable properties. Traditionally, art has created illusions meant to elicit emotion; physics has been an exact science that made sense…
Yet, despite what appear to be irreconcilable differences, there is one fundamental feature that solidly connects these disciplines. Revolutionary art and visionary physics are both investigations into the nature of reality. Roy Lichtenstein, the pop artist of the 1960s, declared, “Organized perception is what art is all about.” Sir Isaac Newton might have said as much for physics; he, too, was concerned with organizing perceptions. While their methods differ radically, artists and physicists share the desire to investigate the ways the interlocking pieces of reality fit together. This is the common ground upon which they meet.
Although the development of physics has always depended upon the incremental contributions of many original and dedicated workers, on a few occasions in history, one physicist has had an insight of such import that it led to a revision in his whole society’s concept of reality. . . .
Emile Zola’s definition of art, “Nature as seen through a temperament,” invokes physics, which is likewise involved with nature. The Greek word, physis, means “nature.” … The physicist, like any scientist, sets out to break “nature” down into its component parts to analyze the relationship of those parts. This process is principally one of reduction. The artist, on the other hand, often juxtaposes different features of reality and synthesizes them, so that upon completion, the whole work is greater than the sum of its parts. There is considerable crossover in the technique used by both. The novelist Vladimir Nabokov wrote, “There is no science without fancy and no art without facts.”
In addition to illuminating, imitating, and interpreting reality … artists create a language of symbols for things for which there are yet to be words.
This capacity for abstraction and symbolic representation, Shlain argues, is hard-wired into the evolution of our cognitive development:
Observe any infant as it masters its environment. Long before speech occurs, a baby develops an association between the image of a bottle and a feeling of satisfaction. Gradually, the baby accumulates a variety of images of bottles. This is an astounding feat considering that a bottle viewed from different angles changes shape dramatically: from a cylinder to an ellipse to a circle. Synthesizing these images, the child’s emerging conceptual faculties invent an abstract image that encompasses the idea of an entire group of objects she or he will henceforth recognize as bottles. This step in abstraction allows the infant to understand the idea of “bottleness.”
This rudimentary faculty remains central to how we make sense of the world as adults and how we grasp its immaterial subtleties:
Concepts such as “justice,” “freedom” or “economics” can be turned over in the mind without ever resorting to mental pictures. While there is never final resolution between word and image, we are a species dependent on the abstractions of language and in the main, the word eventually supplants the image.
When we reflect, ruminate, reminisce, muse and imagine, generally we revert to the visual mode. But in order to perform the brain’s highest function, abstract thinking, we abandon the use of images and are able to carry on without resorting to them. It is with great precision that we call this type of thinking, “abstract.” This is the majesty and the tyranny of language. To affix a name to something is the beginning of control over it. . . . Words, more than strength or speed, became the weapons that humans have used to subdue nature.
Children’s use of metaphor, we now know, sheds light on the evolution of human imagination — something Shlain argues is central to our ability to navigate the world. Adding to history’s most elegant definitions of art, he argues for the cultural role of the artist in fostering this crucial domain of understanding:
Because the erosion of images by words occurs at such an early age, we forget that in order to learn something radically new, we need first to imagine it. “Imagine” literally means to “make an image.” … [If] this function of imagination, so crucial to the development of an infant, is also present in the civilization at large, who then creates the new images that precede abstract ideas and descriptive language? It is the artist.
Art [lives] not only as an aesthetic that can be pleasing to the eye but as a Distant Early Warning system of the collective thinking of a society. Visionary art alerts the other members that a conceptual shift is about to occur in the thought system used to perceive the world.
One of Lisbeth Zwerger’s imaginative illustrations for ‘Alice in Wonderland.’
He cites art critic Robert Hughes’s assertion that “the truly significant work of art is the one that prepares the future” and adds:
Repeatedly throughout history, the artist introduces symbols and icons that in retrospect prove to have been an avant-garde for the thought patterns of a scientific age not yet born.
Revolutionary art in all times has served this function of preparing the future.
Shlain returns to the common ground between art and physics, both of which serve as tools for mapping the unknown:
Both art and physics are unique forms of language. Each has a specialized lexicon of symbols that is used in a distinctive syntax. Their very different and specific contexts obscure their connection to everyday language as well as to each other. Nevertheless, it is noteworthy just how often the terms of one can be applied to the concepts of the other… While physicists demonstrate that A equals B or that X is the same as Y, artists often choose signs, symbols and allegories to equate a painterly image with a feature of experience. Both of these techniques reveal previously hidden relationships.
Revolutionary art and visionary physics attempt to speak about matters that do not yet have words. That is why their languages are so poorly understood by people outside their fields. Because they both speak of what is certainly to come, however, it is incumbent upon us to learn to understand them.
Illustration from ‘Alice in Quantumland: An Allegory of Quantum Physics’ by CERN physicist Robert Gilmore.
Turning to the famous Tower of Babel myth — a Biblical story about humanity’s collaborative effort to build a tower that would reach the heavens, paralyzed by an indignant god’s spell that transformed people’s previously common language into garbled speech that made them unable to communicate and collaborate — Shlain draws a parallel to the artificial garbling of the shared language of art and physics:
History has been the record of our agonizingly slow resumption of work on this mythic public monument to knowledge. Gradually the parochial suspicions that had been abetted by large numbers of local dialects have given way to the more universal outlook of modern humankind. Currently, this work in progress is the creation of a global commonwealth. The worldwide community of artists and scientists is and has been in the forefront of this coalescence, offering perceptions of reality that erase linguistic and national boundaries. Reconciliation of the apparent differences between these two unique human languages, art and physics, is the next important step in developing our unifying Tower.
Both disciplines, he argues, first require us to ask how we know the world. Tracing the history of the answer from Plato to Descartes to Kant, Shlain points to philosophers’ distinction between “the inner eye of imagination and the external world of things” as a toxic and artificial divide that drove art and physics apart:
The faculty we use to grasp the nature of the “out there” is our imagination. Somewhere within the matrix of our brain we construct a separate reality created by a disembodied, thinking consciousness. This inner reality is unconnected to external space and exists outside the stream of linear time. When reminiscing about a day at the beach, we knit together elements of that day that no longer “actually” exist. We can run the events forward and backward with ease, and amend with alternate possibilities what we believe happened… Consciousness, resembling nothing so much as long columns of ants at work, must laboriously transfer the outside world piece by piece through the tunnels of the senses, then reconstruct it indoors. This inner spectral vision amounts to a mental “opinion” unique to each individual of how the world works… When an entire civilization reaches a consensus about how the world works, the belief system is elevated to the supreme status of a “paradigm,” whose premises appear to be so obviously certain no one has to prove them anymore.
Shlain points to the beginning of the 20th century, when Einstein’s theory of photons challenged two centuries of considering light a wave, as a turning point for the integration between art and physics. Suddenly, by acknowledging the contradictory duality of light as both a particle and a wave, science had to confront its basic tenet of objectivity and fixed laws. As Shlain puts it, “at the turn of the century, what was to be a surprising feature of quantum reality amounted to a Zen koan.”
Illustration by Vladimir Radunsky from ‘On a Beam of Light: A Story of Albert Einstein’ by Jennifer Berne.
In 1926, Niels Bohr formalized this notion in his theory of complementarity, which stated that light was not either a wave or a particle, but was both a wave and a particle. Shlain writes:
As it turned out, light would reveal only one aspect of its nature at a time, resembling an odd carnival peep show. Whenever a scientist set up an experiment to measure the wavelike aspect of light, the subjective act of deciding which measuring device to use in some mysterious way affected the outcome, and light responded by acting as a wave. The same phenomenon occurred whenever a scientist set out to measure the particlelike aspect of light. Thus “subjectivity,” the anathema of all science (and the creative wellspring of all art) had to be admitted into the carefully defended citadel of classical physics. Werner Heisenberg, Bohr’s close associate, said in support of this bizarre notion, “The common division of the world into subject and object, inner world and outer world, body and soul is no longer adequate…. Natural science does not simply describe and explain nature; it is part of the interplay between nature and ourselves.” According to the new physics, observer and observed are somehow connected, and the inner domain of subjective thought turns out to be intimately conjoined to the external sphere of objective facts.
From this revolutionary duality of light Shlain extracts a broader metaphor for his central thesis:
[Through] the complementarity of art and physics … these two fields intimately entwine to form a lattice upon which we all can climb a little higher in order to construct our view of reality. Understanding this connection should enhance our appreciation for the vitality of art and deepen our sense of awe before the ideas of modern physics. Art and physics, like wave and particle, are an integrated duality: They are simply two different but complementary facets of a single description of the world. Integrating art and physics will kindle a more synthesized awareness which begins in wonder and ends with wisdom.
In the remainder of Art & Physics, a mind-expanding read in its totality, Shlain goes on to trace the evolution of human thinking and knowledge from Ancient Greece to the Renaissance to the 20th century, exploring various aspects of the parallels between the two disciplines, from Einstein and Picasso’s “common vision” to the interplay between illusion and reality to how music integrates the reason of science with the emotional expressiveness of art. Complement it with Dorion Sagan (son of Carl) on how science and philosophy enrich each other.