Ten Years After Hunter S. Thompson’s Death, the Debate Over Suicide Rages On


February 20, 2015

Today, February 20, marks the tenth anniversary of Hunter S. Thompson killing himself with a .45-caliber handgun in his home in Woody Creek, Colorado. Since his suicide, the right-to-die movement has gained a stronger foothold in American consciousness—even if the country is just as divided as ever on whether doctors should be assisting patients in ending their own lives.

“Polling has always shown a majority of people believing that someone has a moral right to commit suicide under some circumstances, but that majority has been increasing over time,” says Matthew Wynia, director of the Center for Bioethics and Humanities at the University of Colorado Anschutz Medical Campus. Wynia believes a chief factor in that change has been that “more and more people say they’ve given a good deal of thought on this issue. And the more people tend to give thought to this issue, the more likely they are to say they are in favor of people having a moral right to commit suicide, under certain circumstances.”

The sticking point is what constitutes a justifiable reason to kill yourself or have a doctor do so for you. In Thompson’s case, he was suffering from intense physical discomfort due to a back injury, broken leg, hip replacement surgery, and a lung infection. But his widow, Anita, says that while the injuries were significant, they did not justify his suicide.

“His pain was unbearable at times, but was by no means terminal,” Anita tells me via email. “That is the rub. If it were a terminal illness, the horrible aftermath would have been different for me and his loved ones. None of us minded caring for him.”

A mix of popular culture and legislative initiatives have shifted the terrain since then. When Thompson made his big exit in 2005, Jack Kevorkian was still incarcerated for helping his patients shuffle off their mortal coil. He was released in 2007, and shortly before his death a few years later, HBO chronicled his struggles to change public opinion of physician-assisted suicide in the film You Don’t Know Jack, starring Al Pacino.

Last year, suicide seemed to cross a threshold of legitimacy in America. When terminally ill 29-year-old Brittany Maynard appeared on the cover of People magazine next to the headline, “My Decision to Die,” the issue was thrust into the faces of every supermarket shopper in the US. Earlier in the year, the season finale of Girls closed with one of the main characters agreeing to help her geriatric employer end her life, only to have the woman back out after swallowing a fistful of pills, shouting, “I don’t want to die!”

After the self-inflicted death of Robin Williams last summer, those with strong moral opposition to suicide used the tragedy as an illustration of how much taking your life hurts those around you. “I simply cannot understand how any parent could kill themselves,” Henry Rollins wrote in an editorial for LA Weekly. “I don’t care how well adjusted your kid might be—choosing to kill yourself, rather than to be there for that child, is every shade of awful, traumatic and confusing. I think as soon as you have children, you waive your right to take your own life… I no longer take this person seriously. Their life wasn’t cut short—it was purposely abandoned.”

A decade earlier, Rollins’s comments might have gone unnoticed. As might have Fox News’ Shepard Smith when he referred to Williams as “such a coward” for abandoning his children. Of course, both received a good lashing in the court of public opinion for being so dismissive toward someone suffering from depression. “To the core of my being, I regret it,” Smith apologized in a statement. Rollins followed suit, saying, “I should have known better, but I obviously did not.”

A 2013 Pew Research poll found that 38 percent of Americans believed that a person has a moral right to commit suicide if “living has become a burden.” But if the person is described as “suffering great pain” with “no hope of improvement,” the number increased to 62 percent, a seven-point jump from the way Americans felt about the issue in 1990.

“Psychic suffering is as important as physical suffering when determining if a person should have help to die.”

Still, only 47 percent of Americans in a Pew poll last October said that a doctor should be allowed to facilitate a suicide, barely different from the numbers at the time of Thompson’s death. Wynia believes an enduring factor here is the public’s fear that assisted suicide will be applied as a cost-cutting measure to an already overburdened healthcare system.

“There is worry that insurance companies will cover medication to end your life, but they won’t cover treatments that allow you to extend your life,” he says. “And then the family is stuck with either ponying up the money to extend that person’s life, or they could commit suicide. That puts a lot of pressure on both the family and the individual. Also, there is the issue of the doctor being seen as a double agent who isn’t solely looking out for their best interest.”

As with abortion before Roe v. Wade, when determined citizens are denied medical assistance and left to their own devices, the results can sometimes be disastrous. “There are people who try and fail at suicide, and sometimes they end up in much worse positions than they started,” Wynia adds. “I’ve cared for someone who tried to commit suicide by drinking Drano; that’s a good way to burn out your entire esophagus, and if you survive it, you’re in very bad shape afterward.”

A 2014 Gallup poll showed considerably more support for doctors’ involvement in ending a patient’s life. When asked if physicians should be allowed to “legally end a patient’s life by some painless means,” 69 percent of Americans said they were in favor of such a procedure. But when the question was whether physicians should be able to “assist the patient to commit suicide,” support dropped to 58 percent. This has led many advocacy groups to adopt the term “aid in dying” as opposed to “assisted suicide.”

A statement on the Compassion and Choices website states: “It is wrong to equate ‘suicide,’ which about 30,000 Americans, suffering from mental illness, tragically resort to each year, with the death-with-dignity option utilized by only 160 terminally ill, but mentally competent, patients in Oregon and Washington last year.”

According to Oregon’s Death With Dignity Act—which permitted Brittany Maynard to be prescribed a lethal dose of drugs from her physician—a patient must be over 18 years old, of sound mind, and diagnosed with a terminal illness with less than six months to live in order to be given life-ending care. Currently, four other states have bills similar to Oregon’s, while 39 states have laws banning physician-assisted suicide. Earlier this month, legislators in Colorado attempted to pass their own version of an assisted suicide bill, but it failed in committee.

In 1995, Australia’s Northern Territory briefly legalized euthanasia through the Rights of the Terminally Ill Act. Dr. Philip Nitschke was the first doctor to administer a voluntary lethal injection to a patient, followed by three more before the law was overturned by the Australian Parliament in 1997. Nitschke retired from medicine that year and began working to educate the public on how to administer their own life-ending procedure without medical supervision or assistance. Last summer, the Australian Medical Board suspended his medical registration, a decision which he is appealing.

Nitschke says two states in Australia currently offer life in prison as a penalty for anyone assisting in another’s suicide, and that he’s been contacted by the British police, who say he may be in violation of the United Kingdom’s assisted suicide laws for hosting workshops educating Brits on how to kill themselves. Unlike more moderate groups like Compassion and Choices, Nitschke’s Exit International doesn’t shy away from words like “suicide,” and feels that the right to die should be expanded dramatically.


Laws in most countries that allow physician-assisted suicide under specific circumstances do not consider psychological ailments like depression a justifiable reason for ending your life. Nitschke sees a circular hypocrisy in this, arguing that everyone should be granted the right to end their own life regardless of health, and that those suffering a mental illness are still able to give informed consent.

“Psychic suffering is as important as physical suffering when determining if a person should have help to die,” Nitschke tells me. “The prevailing medical board [in Australia] views almost any psychiatric illness as a reason why one cannot give consent—but the catch-22 is that anyone contemplating suicide, for whatever reason, must be suffering psychiatric illness.”

These days, Nitschke is avoiding criminal prosecution by merely providing information on effective suicide techniques. So long as he doesn’t physically administer a death agent to anyone—the crime that resulted in Kevorkian being hit with a second-degree murder conviction and eight years in prison—he’ll most likely steer clear of jail time.

Philip Nitschke’s euthanasia machine. Photo via Wikimedia Commons

“I think our society is very confused about liberty,” Andrew Solomon, author of The Noonday Demon: An Atlas of Depression, wrote in 2012. “I don’t think it makes sense to force women to carry children they don’t want, and I don’t think it makes sense to prevent people who wish to die from doing so. Just as my marrying my husband doesn’t damage the marriages of straight people, so people who end their lives with assistance do not threaten the lives or decisions of other people.”

While support for laws banning physician-assisted suicide typically comes from conservative religious groups and those mistrustful of government-run healthcare, the idea that the government has a role in deciding your end-of-life care is rooted in a left-leaning philosophy.

“The theory used to be that the state has an interest in the health and wellbeing of its citizens,” according to Wynia, “and therefore you as a citizen do not have a right to kill yourself, because you are, in essence, a property of the state.”

This conflicted greatly with the philosophy of Hunter S. Thompson. A proponent of both left-wing social justice and right-wing rhetoric about personal freedoms, Thompson had very strong feelings about the role of government in our daily lives, particularly when it came to what we were allowed to do with our own bodies.

“He once said to me, ‘I’d feel real trapped in this life, Ralph, if I didn’t know I could commit suicide at any moment,'” remembered friend and longtime collaborator Ralph Steadman in a recent interview with Esquire.

Sitting in a New York hotel room while writing the introduction to The Great Shark Hunt, a collection of his essays and journalism published in 1979, Thompson described feeling an existential angst when reflecting on the body of work. “I feel like I might as well be sitting up here carving the words for my own tombstone… and when I finish, the only fitting exit will be right straight off this fucking terrace and into The Fountain, 28 stories below and at least 200 yards out into the air and across Fifth Avenue… The only way I can deal with this eerie situation at all is to make a conscious decision that I have already lived and finished the life I planned to live—(13 years longer, in fact).”

Thompson’s widow, Anita, was on the phone with her husband when he took his life. To this day, she feels that the situation was far from hopeless, that his injuries weren’t beyond repair, and that he still had plenty of years left in him.

“He was about to have back surgery again, which meant that the problem would soon be fixed and he could commence his recovery,” she tells me. “My belief is that supporting somebody’s ‘freedom’ to commit suicide because he or she is in some pain or depressed is much different than a chronic or terminal illness. Although I’ve healed from the tragedy, the fact that his personal decision was actually hurried by a series of events and people that later admitted they supported his decision, still haunts me today.”

In September 2005, Rolling Stone published what has come to be known as Hunter Thompson’s suicide note. Despite being written four days beforehand, the brief message does contain the weighty despair of a man unable to inspire in himself the will to go on:

No More Games. No More Bombs. No More Walking. No More Fun. No More Swimming. 67. That is 17 years past 50. 17 more than I needed or wanted. Boring. I am always bitchy. No Fun — for anybody. 67. You are getting Greedy. Act your old age. Relax—This won’t hurt.

Seeing as he lived his life as an undefinable political anomaly—he was an icon of the hedonism of the 60s and 70s, and also a card-carrying member of the NRA—it’s only fitting that Thompson’s exit from this earth was through the most divisive and controversial doorway possible.

“The fundamental beliefs that underlie our nation are sometimes in conflict with each other—and these issues get at some of the basic tensions in what we value as Americans,” says Wynia. “We value our individual liberties, we value our right to make decisions for ourselves, but we also are a religious community, and we are mistrustful of authority. When you talk about giving the power to doctors or anyone else to help you commit suicide, it makes a lot of people nervous. Even though we also have a libertarian streak that believes, ‘I should be allowed to do this, and I should be allowed to ask my doctor to help me.’ I think this is bound to be a contentious issue for some time to come.”

If you are feeling hopeless or suicidal, there are people you can talk to. Please call the Suicide Prevention Lifeline at 1-800-273-8255.

Follow Josiah M. Hesse on Twitter.

 

http://www.vice.com/read/ten-years-after-hunter-s-thompsons-death-the-debate-over-suicide-rages-on-220

The Island of Knowledge: How to Live with Mystery in a Culture Obsessed with Certainty and Definitive Answers


“We strive toward knowledge, always more knowledge, but must understand that we are, and will remain, surrounded by mystery.”

“Our human definition of ‘everything’ gives us, at best, a tiny penlight to help us with our wanderings,” Benjamen Walker offered in an episode of his excellent Theory of Everything podcast as we shared a conversation about illumination and the art of discovery. Thirty years earlier, Carl Sagan had captured this idea in his masterwork Varieties of Scientific Experience, where he asserted: “If we ever reach the point where we think we thoroughly understand who we are and where we came from, we will have failed.” This must be what Rilke, too, had at heart when he exhorted us to live the questions. And yet if there is one common denominator across the entire history of human culture, it is the insatiable hunger to know the unknowable — that is, to know everything, and to know it with certainty, which is itself the enemy of the human spirit.

The perplexities and paradoxes of that quintessential human longing, and how the progress of modern science has compounded it, is what astrophysicist and philosopher Marcelo Gleiser examines in The Island of Knowledge: The Limits of Science and the Search for Meaning (public library).

Partway between Hannah Arendt’s timeless manifesto for the unanswerable questions at the heart of meaning and Stuart Firestein’s case for how not-knowing drives science, Gleiser explores our commitment to knowledge and our parallel flirtation with the mystery of the unknown.

Artwork from ‘Fail Safe,’ Debbie Millman’s illustrated-essay-turned-commencement address on courage and the creative life.

What emerges is at once a celebration of human achievement and a gentle reminder that the appropriate reaction to scientific and technological progress is not arrogance over the knowledge conquered, which seems to be our civilizational modus operandi, but humility in the face of what remains to be known and, perhaps above all, what may always remain unknowable.

Gleiser begins by posing the question of whether there are fundamental limits to how much of the universe and our place in it science can explain, with a concrete focus on physical reality. Echoing cognitive scientist Alexandra Horowitz’s eye-opening exploration of why our minds miss the vast majority of what is going on around us, he writes:

What we see of the world is only a sliver of what’s “out there.” There is much that is invisible to the eye, even when we augment our sensorial perception with telescopes, microscopes, and other tools of exploration. Like our senses, every instrument has a range. Because much of Nature remains hidden from us, our view of the world is based only on the fraction of reality that we can measure and analyze. Science, as our narrative describing what we see and what we conjecture exists in the natural world, is thus necessarily limited, telling only part of the story… We strive toward knowledge, always more knowledge, but must understand that we are, and will remain, surrounded by mystery… It is the flirting with this mystery, the urge to go beyond the boundaries of the known, that feeds our creative impulse, that makes us want to know more.

A 1573 painting by Portuguese artist, historian, and philosopher Francisco de Holanda, a student of Michelangelo’s, from Michael Benson’s book ‘Cosmigraphics’—a visual history of understanding the universe.

In a sentiment that bridges Philip K. Dick’s formulation of reality as “that which, when you stop believing in it, doesn’t go away” with Richard Feynman’s iconic monologue on knowledge and mystery, Gleiser adds:

The map of what we call reality is an ever-shifting mosaic of ideas.

[…]

The incompleteness of knowledge and the limits of our scientific worldview only add to the richness of our search for meaning, as they align science with our human fallibility and aspirations.

Gleiser notes that while modern science has made tremendous strides in illuminating the neuronal infrastructure of the brain, it has in the process reduced the mind to mere chemical operations, not only failing to advance but perhaps even impoverishing our understanding and sense of being. He admonishes against mistaking measurement for meaning:

There is no such thing as an exact measurement. Every measurement must be stated within its precision and quoted together with “error bars” estimating the magnitude of errors. High-precision measurements are simply measurements with small error bars or high confidence levels; there are no perfect, zero-error measurements.

[…]

Technology limits how deeply experiments can probe into physical reality. That is to say, machines determine what we can measure and thus what scientists can learn about the Universe and ourselves. Being human inventions, machines depend on our creativity and available resources. When successful, they measure with ever-higher accuracy and on occasion may also reveal the unexpected.

[…]

But the essence of empirical science is that Nature always has the last word… It then follows that if we only have limited access to Nature through our tools and, more subtly, through our restricted methods of investigation, our knowledge of the natural world is necessarily limited.

And yet even though much of the world remains invisible to us at any given moment, Gleiser argues that this is what the human imagination thrives on. At the same time, however, the very instruments that we create with this restless imagination begin to shape what is perceivable, and thus what is known, making “reality” a Rube Goldberg machine of detectable measurements. Gleiser writes:

If large portions of the world remain unseen or inaccessible to us, we must consider the meaning of the word “reality” with great care. We must consider whether there is such a thing as an “ultimate reality” out there — the final substrate of all there is — and, if so, whether we can ever hope to grasp it in its totality.

[…]

Our perception of what is real evolves with the instruments we use to probe Nature. Gradually, some of what was unknown becomes known. For this reason, what we call “reality” is always changing… The version of reality we might call “true” at one time will not remain true at another.

[…]

As long as technology advances — and there is no reason to suppose that it will ever stop advancing for as long as we are around — we cannot foresee an end to this quest. The ultimate truth is elusive, a phantom.

Artwork by Marian Bantjes from ‘Beyond Pretty Pictures.’

To illustrate this notion, Gleiser constructs the metaphor after which his book is titled — he paints knowledge as an island surrounded by the vast ocean of the unknown; as we learn more, the island expands into the ocean, its coastline marking the ever-shifting boundary between the known and the unknown. Paraphrasing the Socratic paradox, Gleiser writes:

Learning more about the world doesn’t lead to a point closer to a final destination — whose existence is nothing but a hopeful assumption anyway — but to more questions and mysteries. The more we know, the more exposed we are to our ignorance, and the more we know to ask.

Echoing Ray Bradbury’s poetic conviction that it’s part of human nature “to start with romance and build to a reality,” Gleiser adds:

This realization should open doors, not close them, since it makes the search for knowledge an open-ended pursuit, an endless romance with the unknown.

Gleiser admonishes against the limiting notion that we only have two options — staunch scientism, with its blind faith in science’s ability to permanently solve the mysteries of the unknown, and religious obscurantism, with its superstitious avoidance of inconvenient facts. Instead, he offers a third approach “based on how an understanding of the way we probe reality can be a source of endless inspiration without the need for setting final goals or promises of eternal truths.” In an assertion that invokes Sagan’s famous case for the vital balance between skepticism and openness, Gleiser writes:

This unsettled existence is the very blood of science. Science needs to fail to move forward. Theories need to break down; their limits need to be exposed. As tools probe deeper into Nature, they expose the cracks of old theories and allow new ones to emerge. However, we should not be fooled into believing that this process has an end.

I recently tussled with another facet of this issue — the umwelt of the unanswerable — in contemplating the future of machines that think for John Brockman’s annual Edge question. But what makes Gleiser’s point particularly gladdening is the underlying implication that despite its pursuit of answers, science thrives on uncertainty and thus necessitates an element of unflinching faith — faith in the process of the pursuit rather than the outcome, but faith nonetheless. And while the difference between science and religion might be, as Krista Tippett elegantly offered, in the questions they ask rather than the answers they offer, Gleiser suggests that both the fault line and the common ground between the two is a matter of how each relates to mystery:

Can we make sense of the world without belief? This is a central question behind the science and faith dichotomy… Religious myths attempt to explain the unknown with the unknowable while science attempts to explain the unknown with the knowable.

[…]

Both the scientist and the faithful believe in unexplained causation, that is, in things happening for unknown reasons, even if the nature of the cause is completely different for each. In the sciences, this belief is most obvious when there is an attempt to extrapolate a theory or model beyond its tested limits, as in “gravity works the same way across the entire Universe,” or “the theory of evolution by natural selection applies to all forms of life, including extraterrestrial ones.” These extrapolations are crucial to advance knowledge into unexplored territory. The scientist feels justified in doing so, given the accumulated power of her theories to explain so much of the world. We can even say, with slight impropriety, that her faith is empirically validated.

A 1617 depiction of the notion of non-space, long before the concept of vacuum existed, found in Michael Benson’s book ‘Cosmigraphics’—a visual history of understanding the universe.

Citing Newton and Einstein as prime examples of scientists who used wholly intuitive faith to advance their empirical and theoretical breakthroughs — one by extrapolating from his gravitational findings to assert that the universe is infinite and the other by inventing the notion of a “universal constant” to discuss the finitude of space — Gleiser adds:

To go beyond the known, both Newton and Einstein had to take intellectual risks, making assumptions based on intuition and personal prejudice. That they did so, knowing that their speculative theories were necessarily faulty and limited, illustrates the power of belief in the creative process of two of the greatest scientists of all time. To a greater or lesser extent, every person engaged in the advancement of knowledge does the same.

The Island of Knowledge is an illuminating read in its totality — Gleiser goes on to explore how conceptual leaps and bounds have shaped our search for meaning, what quantum mechanics reveal about the nature of physical reality, and how the evolution of machines and mathematics might affect our ideas about the limits of knowledge.

For a fine complement, see Hannah Arendt on thinking vs. knowing and the crucial difference between truth and meaning and astrophysicist Janna Levin on whether the universe is infinite or finite, then treat yourself to Gleiser’s magnificent conversation with novelist Marilynne Robinson — herself a thinker of perceptive and nuanced insight on mystery — on the existentially indispensable On Being:

GLEISER: To think of science as separate from spirituality to me is a big mistake… There is nothing that says that science should be dispassionate about the spirit or the life of the spirit. And to me it’s quite the opposite. It’s exactly because I feel very spiritually connected with nature that I am a scientist. And to write equations on a blackboard and to come up with models about how nature works is, in a sense, a form of worship of that spirituality.

[…]

ROBINSON: One of the things that is fascinating is that we don’t know who we are. Human beings in acting out history describe themselves and every new epic is a new description of what human beings are. Every life is a new description of what human beings are. Every work of science, every object of art is new information. And it is inconceivable at this point that we could say anything final about what the human mind is, because it is demonstrating … in beautiful ways and terrifying ways, that it will surprise us over and over and over again.

 

http://www.brainpickings.org/2015/02/02/the-island-of-knowledge-marcelo-glasier/

Why There Is No Massive Antiwar Movement in America


What’s missing is any sense of connection to the government, or that we the people matter.

Well, it’s one, two, three, look at that amputee,
At least it’s below the knee,
Could have been worse, you see.
Well, it’s true your kids look at you differently,
But you came in an ambulance instead of a hearse,
That’s the phrase of the trade,
It could have been worse.

— First verse of a Vietnam-era song written by U.S. Air Force medic Bob Boardman off Country Joe McDonald’s “I-Feel-Like-I’m-Fixin’-to-Die Rag”

There was the old American lefty paper, the Guardian, and the Village Voice, which beat the Sixties into the world, and its later imitators like the Boston Phoenix. There was Liberation News Service, the Rat in New York, the Great Speckled Bird in Atlanta, the Old Mole in Boston, the distinctly psychedelic Chicago Seed, Leviathan, Viet-Report, and the L.A. Free Press, as well as that Texas paper whose name I long ago forgot that was partial to armadillo cartoons. And they existed, in the 1960s and early 1970s, amid a jostling crowd of hundreds of “underground” newspapers — all quite aboveground but the word sounded so romantic in that political moment. There were G.I. antiwar papers by the score and high school rags by the hundreds in an “alternate” universe of opposition that somehow made the rounds by mail or got passed on hand-to-hand in a now almost unimaginable world of interpersonal social networking that preceded the Internet by decades. And then, of course, there was I.F. Stone’s Weekly (1953-1971): one dedicated journalist, 19 years, every word his own (except, of course, for the endless foolishness he mined from the reams of official documentation produced in Washington, Vietnam, and elsewhere).

I can remember the arrival of that newsletter, though I no longer know whether I subscribed myself or simply shared someone else’s copy. In a time when being young was supposed to be glorious, Stone was old — my parents’ age — but still we waited on his words. It helped to have someone from a previous generation confirm in nuts and bolts ways that the issue that swept so many of us away, the Vietnam War, was indeed an American atrocity.

The Call to Service

They say you can’t go home again, but recently, almost 44 years after I saw my last issue of the Weekly — Stone was 64 when he closed up shop; I was 27 — I found the full archive of them, all 19 years, online, and began reading him all over again. It brought back a dizzying time in which we felt “liberated” from so much that we had been brought up to believe and — though we wouldn’t have understood it that way then — angered and forlorn by the loss as well. That included the John Wayne version of America in which, at the end of any war film, as the Marine Corps Hymn welled up, American troops advanced to a justified victory that would make the world a better place. It also included a far kinder-hearted but allied vision of a country, a government, that was truly ours, and to which we owed — and one dreamed of offering — some form of service. That was deeply engrained in us, which was why when, in his inaugural address, President John F. Kennedy so famously called on us to serve, the response was so powerful. (“And so, my fellow Americans, ask not what your country can do for you; ask what you can do for your country.”) Soon after, my future wife went into the Peace Corps like tens of thousands of other young Americans, while I dreamed, as I had from childhood, of becoming a diplomat in order to represent our country abroad.

And that sense of service to country ran so deep that when the first oppositional movements of the era arose, particularly the Civil Rights Movement, the impulse to serve was essential to them, as it clearly was to I.F. Stone. The discovery that under your country’s shining veneer lay a series of nightmares might have changed how that sense of obligation was applied, but it didn’t change the impulse. Not at all.

In his writing, Stone was calm, civil, thoughtful, fact-based, and still presented an American world that looked shockingly unlike the one you could read about in what wasn’t yet called “the mainstream media” or could see on the nightly network news. (Your TV still had only 13 channels, without a zapper in sight.) A researcher par excellence, Stone, like the rest of us, lacked the ability to see into the future, which meant that some of his fears (“World War III”) as well as his dreams never came true.  But on the American present of that time, he was remarkably on target. Rereading some of his work so many decades later set me thinking about the similarities and differences between that moment of eternal war in Indochina and the present endless war on terror.

Among the eeriest things about reading Stone’s Vietnam, Laos, and Cambodia coverage, 14 years into the next century, is how resonantly familiar so much of what he wrote still seems, how twenty-first-century it all is. It turns out that the national security state hasn’t just been repeating things they’ve done unsuccessfully for the last 13 years, but for the last 60. Let me offer just a few examples from his newsletter. I think you’ll get the idea.

* With last June’s collapse of the American-trained and -armed Iraqi army and recent revelations about its 50,000 “ghost soldiers” in mind, here’s Stone on the Laotian army in January 1961: “It is the highest paid army in Asia and variously estimated (the canny Laotians have never let us know the exact numbers, perhaps lest we check on how much the military payroll is diverted into the pockets of a few leaders) at from 23,000 to 30,000.  Yet it has never been able to stand up against handfuls of guerrillas and even a few determined battalions like those mustered by Captain Kong Le.”

* On ISIS’s offensive in Iraq last year, or the 9/11 attacks, or just about any other development you want to mention in our wars since then, our gargantuan bureaucracy of 17 expanding intelligence outfits has repeatedly been caught short, so consider Stone’s comments on the Tet Offensive of February 1968.  At a time when America’s top commander in Vietnam had repeatedly assured Americans that the Vietnamese enemy was losing, the North Vietnamese and the National Liberation Front (the “Vietcong”) launched attacks on just about every major town and city in South Vietnam, including the U.S. Embassy in Saigon: “We still don’t know what hit us.  The debris is not all in Saigon and Hue.  The world’s biggest intelligence apparatus was caught by surprise.”

* On our drone assassination and other air campaigns as a global war not on, but for — i.e., to recruit — terrorists, including our present bombing campaigns in Iraq and Syria, here’s Stone in February 1968: “When the bodies are really counted, it will be seen that one of the major casualties was our delusion about victory by air power: all that boom-boom did not keep the enemy from showing up at Langvei with tanks… The whole country is slowly being burnt down to ‘save it.’ To apply scorched-earth tactics to one’s own country is heroic; to apply it to a country one claims to be saving is brutal and cowardly… It is we who rally the people to the other side.” And here he is again in May 1970: “Nowhere has air power, however overwhelming and unchallenged, been able to win a war.”

Demobilizing Americans

And so it goes reading Stone today.  But if much in the American way of war remains dismally familiar some five decades later, one thing of major significance has changed, something you can see regularly in I.F. Stone’s Weekly but not in our present world.  Thirteen years after our set of disastrous wars started, where is the massive antiwar movement, including an army in near revolt and a Congress with significant critics in significant positions?

Think of it this way: in 1968, the head of the Senate Foreign Relations Committee was J. William Fulbright, a man who came to oppose U.S. policy in Vietnam and wrote a book about this country titled The Arrogance of Power (a phrase no senator who hoped to stay in Washington in 2015 would apply to the U.S.).  The head of the Senate Armed Services Committee today: John McCain.  ‘Nuff said.

In the last six decades, the American national security state has succeeded strikingly at only one thing (other than turning itself into a growth industry): it freed itself of us and of Congress. In the years following the Vietnam War, the American people were effectively demobilized, shorn of that sense of service to country, while war was privatized and the citizen soldier replaced by an “all-volunteer” force and a host of paid contractors working for warrior corporations. Post-9/11, the citizenry was urged to pay as much attention as possible to “our troops,” or “warriors,” and next to none to the wars they were fighting. Today, the official role of a national security state, bigger and more powerful than in the Vietnam era, is to make Americans “safe” from terror. In a world of war-making that has disappeared into the shadows and a Washington in which just about all information is now classified and shrouded in secrecy, the only way to be “safe” and “secure” as a citizen is, by definition, to be ignorant, to know as little as possible about what “our” government is doing in our name. This helps explain why, in the Obama years, the only crime in official Washington is leaking or whistleblowing; that is, letting the public in on something that we, the people, aren’t supposed to know about the workings of “our” government.

In 1973, President Richard Nixon ended the draft, a move meant to bring a rebellious citizen’s army under control. Since then, in a host of ways, our leaders have managed to sideline the citizenry, replacing the urge to serve with a sense of cynicism about government (which has morphed into many things, including, on the right, the Tea Party movement). As a result, those leaders have been freed from us and from just about all congressional oversight and so have been able to do what they damn well pleased. In practice, this has meant doing the same dumb, brutal, militarized things over and over again. From the repetitive stupidity of twenty-first-century American foreign — that is, war — policy, you might draw the conclusion (though they won’t) that the citizenry, even in revolt, has something crucial to teach the state.

Serving the Country in Opposition

Nonetheless, this demobilization of us should be seen for what it is: a remarkable achievement.  It means that you have to be of a certain age (call me “I.F. Pebble”) even to remember what that urge to serve felt like, especially once it went into opposition on a massive scale. I.F. Stone was an early model for just that.  In those years, I was, too, and there was nothing special about me.  Untold numbers of Americans like me, military and civilian, engaged in such acts and thought of them as service to country.  Though they obviously didn’t fit the normal definition of American “patriotism,” they came from the same place.

In April 1968, not so many months after the Tet Offensive, I went with two close friends to a rally on Boston Common organized by an anti-draft group called the Resistance.  There, the three of us turned in our draft cards.  I went in jacket and tie because I wanted to make the point that we weren’t hippy radicals.  We were serious Americans turning our backs on a war from hell being pursued by a country transforming itself before our eyes into our worst nightmare.

Even all these years later, I can still remember the remarkable sense of exhilaration, even freedom, involved and also the fear.  (In those years, being a relatively meek and law-abiding guy, I often found myself beyond my comfort zone, and so a little — or more than a little — scared.)  Similarly, the next year, a gutsy young woman who was a co-worker and I — I had, by then, dropped out of graduate school and was working at an “underground” movement print shop — drove two unnerved and unnerving Green Beret deserters to Canada.  Admittedly, when they began pretend-machine-gunning the countryside we were passing through, I was unsettled, and when they pulled out dope (no drugs had been the agreed-upon rule on a trip in which we were to cross the Canadian border), I was ready to be anywhere else but in that car.  Still, whatever my anxieties, I had no doubt about why I was doing what I was doing, or about the importance of helping American soldiers who no longer wanted to take part in a terrible war.

Finally, in 1971, an Air Force medic named Bob Boardman, angered by the stream of American war wounded coming home, snuck me into his medical unit at Travis Air Force Base in northern California.  There, though without any experience as a reporter, I “interviewed” a bunch of wigged-out, angry guys with stumps for arms or legs, who were “antiwar” in all sorts of complex, unexpected, and outraged ways.  It couldn’t have been grimmer or more searing or more moving, and I went home, wrote up a three-part series on what I had seen and heard, and sold it to Pacific News Service, a small antiwar outfit in San Francisco (where I would subsequently go to work).

None of this would have been most Americans’ idea of service, even then.  But it was mine.  I felt that my government had betrayed me, and that it was my duty as a citizen to do whatever I could to change its ways (as, in fact, I still do).  And so, in some upside-down, inside-out way, I maintained a connection to and a perverse faith in that government, or our ability to force change on it, as the Civil Rights Movement had done.

That, I suspect, is what’s gone missing in much of our American world and just bringing back the draft, often suggested as one answer to our war-making problems, would be no ultimate solution.  It would undoubtedly change the make-up of the U.S. military somewhat.  However, what’s missing in action isn’t the draft, but a faith in the idea of service to country, the essence of what once would have been defined as patriotism.  At an even more basic level, what may be gone is the very idea of the active citizen, not to speak of the democracy that went with such a conception of citizenship, as opposed to our present bizarro world of multi-billion-dollar 1% elections.

If, so many years into the disastrous war on terror, the Afghan War that never ends, and most recently Iraq War 3.0 and Syria War 1.0, there is no significant antiwar movement in this country, you can thank the only fit of brilliance the national security state has displayed. It successfully drummed us out of service. The sole task it left to Americans, 40 years after the Vietnam War ended, was the ludicrous one of repeatedly thanking the troops for their service, something that would have been inconceivable in the 1950s or 1960s because you would, in essence, have been thanking yourself.

Missing in Action

Here are I.F. Stone’s last words from the penultimate paragraph of the final issue of his newsletter:

“No one could have been happier than I have been with the Weekly. To give a little comfort to the oppressed, to write the truth exactly as I saw it, to make no compromises other than those of quality imposed by my own inadequacies, to be free to follow no master other than my own compulsions, to live up to my idealized image of what a true newspaperman should be, and still be able to make a living for my family — what more could a man ask?”

Here is the last verse that medic wrote in 1971 for his angry song (the first of which led off this piece):

But it’s seven, eight, nine,
Well, he finally died,
Tried to keep him alive,
but he lost the will to survive.
The agony that his life would have been,
Well, you say to yourself as you load up the hearse,
At least, it’s over this way, it could have been worse.

And here are a few words the extremely solemn 23-year-old Tom Engelhardt wrote to the dean of his school on rejecting a National Defense Fellowship grant to study China in April 1968.  (The “General Hershey” I refer to was the director of the Selective Service System which had issued a memo, printed in 1967 by the SDS publication New Left Notes, on “channeling” American manpower where it could best help the state achieve its ends.):

“On the morning of April 3, at the Boston Common, I turned in my draft card.  I felt this to be a reply to three different types of ‘channeling’ which I saw as affecting my own life.  First of all, it was a reply to General Hershey’s statement that manpower channeling ‘is the American or indirect way of achieving what is done by direction in foreign countries where choice is not permitted.’  I disassociated myself from the draft system, which was flagrantly attempting to make me live a life without freedom…

“Finally, I entered into resistance against an American government which was, with the help of the men provided by the draft, attempting the most serious type of ‘channeling’ outside our own country.  This is especially obvious in Vietnam where it denies the people of South Vietnam the opportunity to consider viable alternatives to their present government.  Moreover, as that attempt at ‘channeling’ (or, as it is called, ‘Winning the hearts and minds of the Vietnamese people’) met opposition, the American government, through its armed forces, committed acts of such unbelievable horror as to be unbearable to a thinking person.”

Stone’s sign-off, that medic’s song, and my letter all are documents from a time when Americans could be in opposition to, while also feeling in service to, their country.  In other words, they are documents from a lost world and so would, I suspect, have little meaning to the young of the present moment.  Can there be any question that today’s young are a volunteering crew, often gripped by the urge to help, to make this world of ours a better place?  Can you doubt as well that they are quite capable of rising to resist what’s increasingly grim in that terrible world, as the Occupy moment showed in 2011? Nor, I suspect, is the desire for a government that they could serve gone utterly, as indicated by the movement that formed around Barack Obama in his race for the presidency (and that he and his team essentially demobilized on entering the Oval Office).

What’s missing is any sense of connection to the government, any sense that it’s “ours” or that we the people matter.  In its place — and you can thank successive administrations for this — is the deepest sort of pessimism and cynicism about a national security state and war-making machine beyond our control.  And why protest what you can’t change?

[Note: Ron Unz of the Unz Review is archiving and posting a range of old publications, including all issues of I.F. Stone’s Weekly. This is indeed a remarkable service to the rest of us. To view the Weekly, click here. I.F. Stone’s family has also set up a website dedicated to the man and his work. To visit it, click here.]

 

http://www.alternet.org/environment/why-there-no-massive-antiwar-movement-america

The Killing of America’s Creative Class


A review of Scott Timberg’s fascinating new book, ‘Culture Crash.’

Some of my friends became artists, writers, and musicians to rebel against their practical parents. I went into a creative field with encouragement from my folks. It’s not too rare for Millennials to have their bohemian dreams blessed by their parents, because, as progeny of the Boomers, we were mentored by aging rebels who idolized rogue poets, iconoclast cartoonists, and scrappy musicians.

The problem, warns Scott Timberg in his new book Culture Crash: The Killing of the Creative Class, is that if parents are basing their advice on how the economy used to support creativity – record deals for musicians, book contracts for writers, staff positions for journalists – then they might be surprised when their YouTube-famous daughter still needs help paying off her student loans. A mix of economic, cultural, and technological changes emanating from a neoliberal agenda, writes Timberg, “have undermined the way that culture has been produced for the past two centuries, crippling the economic prospects of not only artists but also the many people who supported and spread their work, and nothing yet has taken its place.”

 

Tech vs. the Creative Class

Timberg isn’t the first to notice. The supposed economic recovery that followed the recession of 2008 did nothing to repair the damage that had been done to the middle class. Only a wealthy few bounced back, and bounced higher than ever before, many of them the elites of Silicon Valley who found a way to harvest much of the wealth generated by new technologies. In Culture Crash, however, Timberg has framed the struggle of the working artist to make a living on his talents.

Besides the overall stagnation of the economy, Timberg shows how information technology has destabilized the creative class and deprofessionalized their labor, leading to an oligopoly of the mega corporations Apple, Google, and Facebook, where success is measured (and often paid) in webpage hits.

What Timberg glosses over is that if this new system is an oligopoly of tech companies, then what it replaced – or is still in the process of replacing – was a feudal system of newspapers, publishing houses, record labels, operas, and art galleries. The book is full of enough discouraging data and painful portraits of artists, though, to make this point moot. Things are definitely getting worse.

Why should these worldly worries make the Muse stutter when she is expected to sing from outside of history and without health insurance? Timberg proposes that if we are to save the “creative class” — the often young, often middle-class sector of society that generates cultural content — we need to shake this old myth. The Muse can inspire but not sustain. Members of the creative class, argues Timberg, depend not just on that original inspiration, but on an infrastructure that moves creations into the larger culture and somehow provides material support for those who make, distribute, and assess them. Today, that indispensable infrastructure is at risk…

Artists may never entirely disappear, but they are certainly vulnerable to the economic and cultural zeitgeist. Remember the Dark Ages? Timberg does, and drapes this shroud over every chapter. It comes off as alarmist at times. Culture is obviously no longer smothered by an authoritarian Catholic church.

 

Art as the Province of the Young and Independently Wealthy

But Timberg suggests that contemporary artists have signed away their rights in a new contract with the market. Cultural producers, no matter how important their output is to the rest of us, are expected to exhaust themselves without compensation because their work is, by definition, worthless until it’s profitable. Art is an act of passion – why not produce it for free, never mind that Apple, Google, and Facebook have the right to generate revenue from your production? “According to this way of thinking,” wrote Miya Tokumitsu describing the do-what-you-love mantra that rode out of Silicon Valley on the back of TED Talks, “labor is not something one does for compensation, but an act of self-love. If profit doesn’t happen to follow, it is because the worker’s passion and determination were insufficient.”

The fact is, when creativity becomes financially unsustainable, less is created, and that which does emerge is the product of trust-fund kids in their spare time. “If working in culture becomes something only for the wealthy, or those supported by corporate patronage, we lose the independent perspective that artistry is necessarily built on,” writes Timberg.

It would seem to be a position with many proponents except that artists have few loyal advocates on either side of the political spectrum. “A working artist is seen neither as the salt of the earth by the left, nor as a ‘job creator’ by the right – but as a kind of self-indulgent parasite by both sides,” writes Timberg.

That’s with respect to unsuccessful artists – in other words, the creative class’s 99 percent. But, as Timberg disparages, “everyone loves a winner.” In their own way, both conservatives and liberals have stumbled into Voltaire’s Candide, accepting that all is for the best in the best of all possible worlds. If artists cannot make money, it’s because they are either untalented or esoteric elitists. It is the giants of pop music who are taking all the spoils, both financially and morally, in this new climate.

Timberg blames this winner-take-all attitude on the postmodernists who, beginning in the 1960s with film critic Pauline Kael, dismantled the idea that creative genius must be rescued from underneath the boots of mass appeal and replaced it with the concept of genius-as-mass-appeal. “Instead of coverage of, say, the lost recordings of pioneering bebop guitarist Charlie Christian,” writes Timberg, “we read pieces ‘in defense’ of blockbuster acts like the Eagles (the bestselling rock band in history), Billy Joel, Rush – groups whose songs…it was once impossible to get away from.”

Timberg doesn’t give enough weight to the fact that the same rebellion at the university liberated an enormous swath of art, literature, and music from the shadow of an exclusive (which is not to say unworthy) canon made up mostly of white men. In fact, many postmodernists have taken it upon themselves to look neither to the pop charts nor the Western canon for genius but, with the help of the Internet, to the broad creative class that Timberg wants to defend.

 

Creating in the Age of Poptimism

This doesn’t mean that today’s discovered geniuses can pay their bills, though, and Timberg is right to be shocked that, for the first time in history, pop culture is untouchable, off limits to critics or laypeople either on the grounds of taste or principle. If you can’t stand pop music because of the hackneyed rhythms and indiscernible voices, you’ve failed to appreciate the wonders of crowdsourced culture – the same mystery that propels the market.

Sadly, Timberg puts himself in checkmate early on by repeatedly pitting black mega-stars like Kanye West against white indie-rockers like the Decemberists, whose ascent to the pop charts he characterizes as a rare triumph of mass taste.

But beyond his anti-hip-hop bias is an important argument: With ideological immunity, the pop charts are mimicking the stratification of our society. Under the guise of a popular carnival where a home-made YouTube video can bring a talented nobody the absurd fame of a celebrity, creative industries have nevertheless become more monotonous and inaccessible to new and disparate voices. In 1986, thirty-one chart-toppers came from twenty-nine different artists. Between 2008 and mid-2012, half of the number-one songs were property of only six stars. “Of course, it’s never been easy to land a hit record,” writes Timberg. “But recession-era rock has brought rewards to a smaller fraction of the artists than it did previously. Call it the music industry’s one percent.”

The same thing is happening with the written word. In the first decade of the new millennium, points out Timberg, citing Wired magazine, the market share of page views for the Internet’s top ten websites rose from 31 percent to 75 percent.

Timberg doesn’t mention that none of the six artists dominating the pop charts for those four years was a white man, but maybe that’s beside the point. In Borges’s “Babylon Lottery,” every citizen has the chance to be a sovereign. That doesn’t mean they were living in a democracy. Superstars are coming up from poverty, without the help of white male privilege, like never before, at the same time that poverty – for artists and for everyone else – is getting worse.

Essayists are often guilted into proposing solutions to the problems they perceive, but in many cases they should have left it alone. Timberg wisely avoids laying out a ten-point plan to clean up the mess, but even his initial thrust toward justice – identifying the roots of the crisis – is a pastiche of sometimes contradictory liberal biases that looks to the past for temporary fixes.

Timberg puts the kibosh on corporate patronage of the arts, but pines for the days of newspapers run by wealthy families. When information technology is his target because it forces artists to distribute their work for free, removes the record store and bookstore clerks from the scene, and feeds consumer dollars to only a few Silicon Valley tsars, Timberg’s answer is to retrace our steps twenty years to the days of big record companies and Borders book stores – since that model was slightly more compensatory to the creative class.

When his target is postmodern intellectuals who slander “middle-brow” culture as elitist, only to expend their breath in defense of super-rich pop stars, Timberg retreats fifty years to when intellectuals like Marshall McLuhan and Norman Mailer debated on network television and the word “philharmonic” excited the uncultured with awe rather than tickled them with anti-elitist mockery. Maybe television back then was more tolerable, but Timberg hardly even tries to sound uplifting. “At some point, someone will come up with a conception better than middlebrow,” he writes. “But until then, it beats the alternatives.”

 

The Fallacy of the Good Old Days

Timberg’s biggest mistake is that he tries to find a point in history when things were better for artists and then reroute us back there for fear of continued decline. What this translates to is a program of bipartisan moderation – a little bit more public funding here, a little more philanthropy there. Something everyone can agree on, but no one would ever get excited about.

Why not boldly state that a society is dysfunctional if there is enough food, shelter, and clothing to go around and yet an individual is forced to sacrifice these things in order to produce, out of humanistic virtue, the very thing which society has never demanded more of – culture? And if skeptics ask for a solution, why not suggest something big, a reorganization of society, from top to bottom, not just a vintage flotation device for the middle class? Rather than blame technological innovation for the poverty of artists, why not point the finger at those who own the technology and call for a system whereby efficiency doesn’t put people out of work, but allows them to work fewer hours for the same salary; whereby information is free not because an unpaid intern wrote content in a race for employment, but because we collectively pick up the tab?

This might not satisfy the TED Talk connoisseur’s taste for a clever and apolitical fix, but it definitely trumps championing a middle ground littered with the casualties of cronyism, colonialism, racism, patriarchy, and all their siblings. And change must come soon because, if Timberg is right, “the price we ultimately pay” for allowing our creative class to remain on its crash course “is in the decline of art itself, diminishing understanding of ourselves, one another, and the eternal human spirit.”

 

http://www.alternet.org/news-amp-politics/killing-americas-creative-class?akid=12719.265072.45wrwl&rd=1&src=newsletter1030855&t=9

Portrait of the Artist as a Dying Class


Scott Timberg argues that we’ve lost the scaffolding of middle-class jobs—record-store clerk, critic, roadie—that made creative scenes thrive. Record store clerks—like Barry (Jack Black) in High Fidelity—are going the way of the dodo. (Getty Images)

BY JOANNA SCUTTS


Though Scott Timberg’s impassioned Culture Crash: The Killing of the Creative Class focuses on the struggles of musicians, writers and designers, it’s not just a story about (the impossibility of) making a living making art in modern America. More urgently, it’s another chapter in America’s central economic story today, of plutocracy versus penury and the evisceration of the middle class.

Timberg lost his job as an arts reporter at the Los Angeles Times in 2008 after real-estate mogul Sam Zell purchased the paper and gutted its staff. But newspapers are experiencing a natural die-off, right? Wrong, says Timberg. He cites statistics showing that newspaper profits remained fat into the 21st century—peaking at an average of 22.3 percent in 2002—even as the industry began slashing staff. The problem isn’t profitability but shareholder greed, and the fact that we’ve ceded so much authority to the gurus of economic efficiency that we’ve failed to check their math.

The story of print journalism’s demise is hardly new, but Timberg’s LA-based perspective brings architecture, film and music into the conversation, exposing the fallacy of the East Coast conviction that Hollywood is the place where all the money is hiding. Movie studios today are as risk-averse and profit-minded as the big New York publishing houses, throwing their muscle behind one or two stars and proven projects (sequels and remakes) rather than nurturing a deep bench of talent.

For aspiring stars to believe that they may yet become the next Kanye or Kardashian is as unrealistic as treating a casino as a viable path to wealth. Not only that, but when all the money and attention cluster around a handful of stars, there’s less variation, less invention, less risk-taking. Timberg notes that the common understanding of the “creative class,” coined by Richard Florida in 2002, encompasses “anyone who works with their mind at a high level,” including doctors, lawyers and software engineers.

But Timberg looks more narrowly at those whose living, both financially and philosophically, depends on creativity, whether or not they are highly educated or technically “white collar.” He includes a host of support staff: technicians and roadies, promoters and bartenders, critics and publishers, and record-store and bookstore autodidacts (he devotes a whole chapter to these passionate, vanishing “clerks”). People in this class could once survive, not lavishly but respectably, leading a decent middle-class life, with even some upward mobility.

Timberg describes the career of a record-store clerk whose passion eventually led him to jobs as a radio DJ and a music consultant for TV. His retail job offered a “ladder into the industry” that no longer exists. Today, in almost all creative industries, the rungs of that ladder have been replaced with unpaid internships, typically out of reach to all but the children of the bourgeoisie. We were told the Internet would render physical locations unimportant and destroy hierarchies, flinging open the gates to a wider range of players. To an extent that Timberg doesn’t really acknowledge, that has proven somewhat true: Every scene in the world now has numerous points of access, and any misfit can find her tribe online. But it’s one thing to find fellow fans; it’s another to find paying work. It turns out that working as unfettered freelancers—one-man brands with laptops for offices—doesn’t pay the rent, even if we forgo health insurance.

Timberg points to stats on today’s music business, for instance, which show that even those who are succeeding, with millions of Twitter followers and Spotify plays, can scrape together just a few thousand dollars for months of work. (Timberg is cautiously optimistic about the arrival of Obamacare, which at least might protect people from the kinds of bankrupting medical emergencies that several of his subjects have suffered.)

In addition, Timberg argues that physical institutions help creativity thrive. His opening chapter documents three successful artistic scenes—Boston’s postwar poetry world, LA’s 1960s boom in contemporary art, and Austin’s vibrant 1970s alternative to the Nashville country-music machine. In analyzing what makes them work, he owes much to urban planner Jane Jacobs: It was livable, affordable, close-knit cities, with plenty of universities and plenty of cheap gathering places that allowed art to flourish in 20th-century America. In Austin, the university and the legislature provided day jobs or other support to the freewheeling artists, Timberg notes: “For all its protests of its maverick status, outlaw country was made possible by public funding.”

Today, affordability has gone out the window. As one freelance writer, Chris Ketcham, puts it, “rent is the basis of everything”—and New York and San Francisco, gouging relentlessly away at their middle class, are driving out the very people who built their unique cultures.

Take live music, for example. Without a robust support structure of people working for money, not just love—local writers who chronicle a scene, talented designers and promoters, bars and clubs that can pay the rent—live music is withering. Our minimum wage economy isn’t helping: For the venue and the band to cover their costs, they need curious music-lovers who have the time and money to come out, pay a cover charge, buy a beer or two and maybe an album. That’s a night out that fewer and fewer people can afford. Wealthy gentrifiers, meanwhile, would rather spend their evenings at a hot new restaurant than a grungy rock club. Foodie culture, Timberg suggests, has pushed out what used to nourish us.

Timberg is not a historian but a journalist, and his book is strongest when he allows creative people to speak for themselves. We hear how the struggles of a hip LA architect echo those of music professors and art critics. However, the fact that most of Timberg’s sources are men (and from roughly the same generation as the author) undercuts the book’s claim to universality. Those successful artistic scenes he cites at the beginning, in Boston, LA and Austin, and the mid-century heyday of American culture in general, were hardly welcoming to women and people of color.

It’s much harder to get upset about the decline of an industry that wasn’t going to let you join in the first place. Although Timberg admits this in passing, he doesn’t explore the way that the chipping away of institutional power might in fact have helped to liberate marginalized artists.

But all the liberation in the world counts for little if you can’t get paid, and Timberg’s central claim—that the number of people making a living by making art is rapidly decreasing—is timely and important, as is his argument that unemployed architects deserve our sympathetic attention just as much as unemployed auto workers.

The challenge is to find a way to talk about the essential role of art and creativity that doesn’t fall back on economic value, celebrity endorsement or vague mysticism. It’s far too important for that.

Joanna Scutts is a freelance writer based in Queens, NY, and a board member of the National Book Critics Circle. Her book reviews and essays have appeared in the Washington Post, the New Yorker Online, The Nation, The Wall Street Journal and several other publications. You can follow her on Twitter @life_savour.

http://inthesetimes.com/article/17522/portrait_of_the_dying_creative_class

Leo Tolstoy’s theory of everything

Before writing some of the greatest novels in history, Tolstoy asked some of philosophy’s hardest questions


Leo Tolstoy (Credit: Wikimedia)

Tolstoy’s first diary, started on March 17, 1847, at the age of eighteen, began as a clinical investigation launched under laboratory conditions: in the isolation of a hospital ward, where he was being treated for a venereal disease. A student at Kazan University, he was about to drop out due to lack of academic progress. In the clinic, freed from external influences, the young man planned to “enter into himself” for intense self-exploration (vzoiti sam v sebia; 46:3). On the first page, he wrote (then crossed out) that he was in complete agreement with Rousseau on the advantages of solitude. This act of introspection had a moral goal: to exert control over his runaway life. Following a well-established practice, the young Tolstoy approached the diary as an instrument of self-perfection.

But this was not all. For the young Tolstoy, keeping a diary (as I hope to show) was also an experimental project aimed at exploring the nature of self: the links connecting a sense of self, a moral ideal, and the temporal order of narrative.

From the very beginning there were problems. For one, the diarist obviously found it difficult to sustain the flow of narrative. To fill the pages of his first diary, beginning on day two, Tolstoy gives an account of his reading, assigned by a professor of history: Catherine the Great’s famous Instruction (Nakaz), as compared with Montesquieu’s L’Esprit des lois. This manifesto, aimed at regulating the future social order, and its philosophical principles, rooted in the French Enlightenment (happy is a man in whom will rules over passions, and happy is a state in which laws serve as an instrument of such control), appealed to the young Tolstoy. But with the account of Catherine’s utopia (on March 26), Tolstoy’s first diary came to an end.

When he started again (and again), Tolstoy commented on the diary itself, its purpose and uses. In his diary, he will evaluate the course of self-improvement (46:29). He will also reflect on the purpose of human life (46:30). The diary will contain rules pertaining to his behavior in specific times and places; he will then analyze his failures to follow these rules (46:34). The diary’s other purpose is to describe himself and the world (46:35). But how? He looked in the mirror. He looked at the moon and the starry sky. “But how can one write this?” he asked. “One has to go, sit at an ink-stained desk, take coarse paper, ink . . . and trace letters on paper. Letters will make words, words—phrases, but is it possible to convey one’s feeling?” (46:65). The young diarist was in despair.



Apart from the diaries, the young Tolstoy kept separate notebooks for rules: “Rules for Developing Will” (1847), “Rules of Life” (1847), “Rules” (1847 and 1853), and “Rules in General” (1850) (46:262–76), as well as “Rules for playing music” (46:36) and “Rules for playing cards in Moscow until January 1” (46:39). There are also rules for determining “(a) what is God, (b) what is man, and (c) what are the relations between God and man” (46:263). It would seem that in these early journals, Tolstoy was actually working not on a history but on a utopia of himself: his own personal Instruction.

Yet another notebook from the early 1850s, “Journal for Weaknesses” (Zhurnal dlia slabostei)—or, as he called it, the “Franklin journal”—listed, in columns, potential weaknesses, such as laziness, mendacity, indecision, sensuality, and vanity, and Tolstoy marked (with small crosses) the qualities that he exhibited on a particular day. Here, Tolstoy was consciously following the method that Benjamin Franklin had laid out in his famous autobiography. There was also an account book devoted to financial expenditures. On the whole, on the basis of these documents, it appears that the condition of Tolstoy’s moral and monetary economy was deplorable. But another expenditure presented still graver problems: that of time.

Along with the first, hesitant diaries, for almost six months in 1847 Tolstoy kept a “Journal of Daily Occupations” (Zhurnal ezhednevnykh zaniatii; 46:245–61), the main function of which was to account for the actual expenditure of time. In the journal, each page was divided into two vertical columns: the first one, marked “The Future,” listed things he planned to do the next day; a parallel column, marked “The Past,” contained comments (made a day later) on the fulfillment of the plan. The most frequent entry was “not quite” (nesovsem). One thing catches the eye: there was no present.

The Moral Vision of Self and the Temporal Order of Narrative

Beginning in 1850, the time scheme of Tolstoy’s “Journal of Daily Occupations” and the moral accounting of the Franklin journal were incorporated into a single narrative. Each day’s entry was written from the reference point of yesterday’s entry, which ended with a detailed schedule for the next day—under tomorrow’s date. In the evening of the next day, Tolstoy reviewed what he had actually done, comparing his use of time to the plan made the previous day. He also commented on his actions, evaluating his conduct on a general scale of moral values. The entry concluded with a plan of action and a schedule for yet another day. The following entry (from March 1851) is typical for the early to mid-1850s:

24. Arose somewhat late and read, but did not have time to write. Poiret came, I fenced, and did not send him away (sloth and cowardice). Ivanov came, I spoke with him for too long (cowardice). Koloshin (Sergei) came to drink vodka, I did not escort him out (cowardice). At Ozerov’s argued about nothing (habit of arguing) and did not talk about what I should have talked about (cowardice). Did not go to Beklemishev’s (weakness of energy). During gymnastics did not walk the rope (cowardice), and did not do one thing because it hurt (sissiness).—At Gorchakov’s lied (lying). Went to the Novotroitsk tavern (lack of fierté). At home did not study English (insufficient firmness). At the Volkonskys’ was unnatural and distracted, and stayed until one in the morning (distractedness, desire to show off, and weakness of character). 25. [This is a plan for the next day, the 25th, written on the 24th—I.P.] From 10 to 11 yesterday’s diary and to read. From 11 to 12—gymnastics. From 12 to 1—English. Beklemishev and Beyer from 1 to 2. From 2 to 4—on horseback. From 4 to 6—dinner. From 6 to 8—to read. From 8 to 10—to write.—To translate something from a foreign language into Russian to develop memory and style.—To write today with all the impressions and thoughts it gives rise to.—25. Awoke late out of sloth. Wrote my diary and did gymnastics, hurrying. Did not study English out of sloth. With Begichev and with Islavin was vain. At Beklemishev’s was cowardly and lack of fierté. On Tver Boulevard wanted to show off. I did not walk on foot to the Kalymazhnyi Dvor (sissiness). Rode with a desire to show off. For the same reason rode to Ozerov’s.—Did not return to Kalymazhnyi, thoughtlessness. At the Gorchakovs’ dissembled and did not call things by their names, fooling myself. Went to L’vov’s out of insufficient energy and the habit of doing nothing. Sat around at home out of absentmindedness and read Werther inattentively, hurrying. 26 [This is a plan for the next day, the 26th, written on the 25th—I.P.] To get up at 5. Until 10—to write the history of this day. From 10 to 12—fencing and to read. From 12 to 1—English, and if something interferes, then in the evening. From 1 to 3—walking, until 4—gymnastics. From 4 to 6, dinner—to read and write.— (46:55).

An account of the present as much as a plan for the future, this diary combines the prescriptive and the descriptive. In the evening of each day, the young Tolstoy reads the present as a failure to live up to the expectations of the past, and he anticipates a future that will embody his vision of a perfect self. The next day, he again records what went wrong today with yesterday’s tomorrow. Wanting reality to live up to his moral ideal, he forces the past to meet the future.

In his attempt to create an ordered account of time, and thus a moral order, Tolstoy’s greatest difficulty remains capturing the present. Indeed, today makes its first appearance in the diary as tomorrow, embedded in the previous day and usually expressed in infinitive verb forms (“to read,” “to write,” “to translate”). On the evening of today, when Tolstoy writes his diary, today is already the past, told in the past tense. His daily account ends with a vision of another tomorrow. Since it appears under tomorrow’s date, it masquerades as today, but the infinitive forms of the verbs suggest timelessness.

In the diaries, unlike in the “Journal of Daily Occupations,” the present is accorded a place, but it is deprived of even a semblance of autonomy: The present is a space where the past and the future overlap. It appears that the narrative order of the diary simply does not allow one to account for the present. The adolescent Tolstoy’s papers contain the following excerpt, identified by the commentators of Tolstoy’s complete works as a “language exercise”: “Le passé est ce qui fut, le futur est ce qui sera et le présent est ce qui n’est pas.—C’est pour cela que la vie de l’homme ne consiste que dans le futur et le passé et c’est pour la même raison que le bonheur que nous voulons posséder n’est qu’une chimère de même que le présent” (1:217).  (The past is that which was, the future is that which will be, and the present is that which is not. That is why the life of man consists in nothing but the future and the past, and it is for the same reason that the happiness we want to possess is nothing but a chimera, just as the present is.) Whether he knew it or not, the problem that troubled the young Tolstoy, as expressed in this language exercise, was a common one, and it had a long history.

What Is Time? Cultural Precedents

It was Augustine, in the celebrated Book 11 of the Confessions, who first expressed his bewilderment: “What is time?” He argued as follows: The future is not yet here, the past is no longer here, and the present does not remain. Does time, then, have a real being? What is the present? The day? But “not even one day is entirely present.” Some hours of the day are in the future, some in the past. The hour? But “one hour is itself constituted of fugitive moments.”

Time flies quickly from future into past. In Augustine’s words, “the present occupies no space.” Thus, “time” both exists (the language speaks of it and the mind experiences it) and does not exist. The passage of time is both real and unreal (11.14.17–11.17.22). Augustine’s solution was to turn inward, placing the past and the future within the human soul (or mind), as memory and expectation. Taking his investigation further, he argues that these qualities of mind are observed in storytelling and fixed in narrative: “When I am recollecting and telling my story, I am looking on its image in present time, since it is still in my memory” (11.18.23). As images fixed in a story, both the past and the future lie within the present, which thus acquires a semblance of being. In the mind, or in the telling of one’s personal story, times exist all at once as traces of what has passed and will pass through the soul. Augustine thus linked the issue of time and the notion of self. In the end, the question “What is time?” was an extension of the fundamental question of the Confessions: “What am I, my God? What is my Nature?” (10.17.26).

For centuries philosophers continued to refine and transform these arguments. Rousseau reinterpreted Augustine’s idea in a secular perspective, focusing on the temporality of human feelings. Being attached to things outside us, “our affections” necessarily change: “they recall a past that is gone” or “anticipate a future that may never come into being.” From his own experience, Rousseau knew that the happiness for which his soul longed was not one “composed of fugitive moments” (“le bonheur que mon coeur regrette n’est point composé d’instants fugitives”) but a single and lasting state of the soul. But is there a state in which the soul can concentrate its entire being, with no need to remember the past or reach into the future? Such were Rousseau’s famous meditations on time in the fifth of his Reveries of the Solitary Walker (Rêveries du promeneur solitaire), a sequel to the Confessions. In both texts Rousseau practiced the habit of “reentering into himself,” with the express purpose of inquiring “What am I?” (“Que suis-je moi-même?”).

Since the mid-eighteenth century, after Rousseau and Laurence Sterne, time, as known through the mind of the perceiving individual, had also been the subject of narrative experiments undertaken in novels and memoirs. By the 1850s, the theme of the being and nonbeing of time in relation to human consciousness, inaugurated by Augustine and secularized by Rousseau, could serve as the topic of an adolescent’s language exercise.

In his later years, as a novelist, Tolstoy would play a decisive role in the never-ending endeavor to catch time in the act. In the 1850s, in his personal diary, the young Tolstoy was designing his first, homemade methods of managing the flow of personal time by narrative means. As we have seen, this dropout student was not without cultural resources. The young Tolstoy could hardly have known Augustine, but he did know Rousseau, whose presence in the early diaries is palpable. (In later years, when he does read Augustine, he will focus on the problem of narrating time and fully appreciate its theological meaning.)  But mostly he proceeded by way of his own narrative efforts: his diary. Fixed in the diary, the past would remain with him; planned in writing, the future was already there. Creating a future past and a present future, the diarist relieved some of the anxieties of watching life pass. But in one domain his efforts fell short of the ideal: not even one day was entirely present.

“A History of Yesterday”

In March 1851, the twenty-two-year-old Tolstoy embarked on another long-planned project: to write a complete account of a single day—a history of yesterday. His choice fell on March 24: “not because yesterday was extraordinary in any way . . . but because I have long wished to tell the innermost [zadushevnuiu] side of life in one day. God only knows how many diverse . . . impressions and thoughts . . . pass in a single day. If it were only possible to recount them all so that I could easily read myself and others could read me as I do. . . .” (1:279).

An outgrowth of the diary, “A History of Yesterday” (Istoriia vcherashnego dnia) was conceived as an experiment: Where would the process of writing take him? (Tolstoy was writing for himself alone; indeed, in his lifetime, “A History of Yesterday” remained unpublished.)

The metaphor of self, or life, as a book, an image to which Tolstoy would return throughout his life, makes its first appearance here. Rousseau, in whose footsteps Tolstoy followed in wanting to make himself into an open book, believed that self-knowledge was based on feeling and that all he had to do was “to make my soul transparent to the reader.” The young Benjamin Franklin, who was a printer, used the image in his own epitaph: He imagined a typeset book of his life and expressed his belief that it would appear once more in a new edition, “revised and corrected by the author.”

Tolstoy, in 1851, seems to have suspected that the problem lay in the narrative itself. Knowing that “there is not enough ink in the world to write such a story, or typesetters to put it into print” (1:279), he nevertheless embarked upon this project.

In the end it turned out that after about twenty-four hours of writing (spread over a three-week period), Tolstoy was still at the start of the day. Having filled what amounts to twenty-six pages of printed text, he abandoned his “History.” By that time Tolstoy was in a position to know that the enterprise was doomed, and not only because of empirical difficulties (“there would not be enough ink in the world to write it, or typesetters to put it in print”), but also because of major philosophical problems (such as the constraints inherent in the nature of narrative).

“A History of Yesterday” starts in the morning: “I arose late yesterday—at a quarter to 10.” What follows is a causal explanation that relates the given event to an earlier event, which happened on the day before yesterday: “—because I had gone to bed after midnight.” At this point, the account is interrupted by a parenthetical remark that places the second event within a system of general rules of conduct: “(It has long been my rule never to retire after midnight, yet this happens to me about 3 times a week).” The story resumes with a detailed description of those circumstances which had led to the second event and a minor moral transgression (going to bed after midnight): “I was playing cards. . . .” (1:279). The account of the action is then interrupted by another digression—the narrator’s reflections on the nature of society games.

After a page and a half, Tolstoy returns to the game of cards. The narrative proceeds, slowly and painfully, tracing not so much external actions as the webs of the protagonist/narrator’s mental activity, fusing two levels of reflections: those that accompanied the action and those that accompany the act of narration. After many digressions, the narrative follows the protagonist home, puts him to bed, and ends with an elaborate description of his dream, leaving the hero at the threshold of “yesterday.”

What, then, is time? In Tolstoy’s “History,” the day (a natural unit of time) starts in the morning, moves rapidly to the previous evening, and then slowly makes its way back towards the initial morning. Time flows backward, making a circle. In the end, Tolstoy wrote not a history of yesterday but a history of the day before yesterday.

This pattern would play itself out once again in Tolstoy’s work when, in 1856, he started working on a historical novel, the future War and Peace. As he later described it (in an explanatory note on War and Peace), Tolstoy’s original plan was to write a novel about the Decembrists. He set the action in the present, in 1856: An elderly Decembrist returns to Moscow from Siberian exile. But before Tolstoy could move any further, he felt compelled to interrupt the narrative progression: “involuntarily I passed from today to 1825” (that is, to the Decembrist uprising). In order to understand his hero in 1825, he then turned to the formative events of the war with Napoleon: “I once again discarded what I had begun and started to write from the time of 1812.” “But then for a third time I put aside what I had begun”—Tolstoy now turned to 1805 (the dawn of the Napoleonic age in Russia; 13:54). The narrative did not progress in time; it regressed. In both an early piece of personal history, “A History of Yesterday,” and the mature historical novel, War and Peace, Tolstoy saw the initial event as the end of a chain of preceding events, locked into causal dependency by the implications of the narrative order. At the time he made this comment on the writing of his novel, Tolstoy seemed to hold this principle as the inescapable logic of historical narrative.

In “A History of Yesterday,” temporal refraction does not end with a shift from the target day to the preceding day. In the description of “the day before yesterday” itself, time also does not progress: It is pulled apart to fit an array of simultaneous processes. The game of cards has come to an end. The narrator is standing by the card table involved in a (mostly silent) conversation with the hostess. It is time to leave, but taking leave does not come easily to the young man; nor is it easy to tell the story of leaving:

I looked at my watch and got up . . . . Whether she wished to end this conversation which I found so sweet, or to see how I would refuse, or whether she simply wished to continue playing, she looked at the figures which were written on the table, drew the chalk across the table—making a figure that could be classified neither as mathematical nor as pictorial—looked at her husband, then between him and me, and said: “Let’s play three more rubbers.” I was so absorbed in the contemplation not of her movements alone, but of everything that is called charme, which it is impossible to describe, that my imagination was very far away, and I did not have time to clothe my words in a felicitous form; I simply said: “No, I can’t.” Before I had finished saying this I began to regret it,—that is, not all of me, but a certain part of me. . . .

—I suppose this part spoke very eloquently and persuasively (although I cannot convey this), for I became alarmed and began to cast about for arguments.—In the first place, I said to myself, there is no great pleasure in it, you do not really like her, and you’re in an awkward position; besides, you’ve already said that you can’t stay, and you have fallen in her estimation. . . .

Comme il est aimable, ce jeune homme.” [How pleasant he is, this young man.]

This sentence, which followed immediately after mine, interrupted my reflections. I began to make excuses, to say I couldn’t stay, but since one does not have to think to make excuses, I continued reasoning with myself: How I love to have her speak of me in the third person. In German this is rude, but I would love it even in German. . . . “Stay for supper,” said her husband.—As I was busy with my reflections on the formula of the third person, I did not notice that my body, while very properly excusing itself for not being able to stay, was putting down the hat again and sitting down quite coolly in an easy chair. It was clear that my mind was taking no part in this absurdity. (1:282–83)

Written from memory, in the past tense, this narrative nevertheless strives to imitate a notation of immediate experience—something like a stenographic transcription of a human consciousness involved in the act of apprehending itself.

Some critics see this as an early instance of what would later be called the “stream of consciousness” or even read Tolstoy’s desire to describe what lies “behind the soul” as an attempt to reach “what we now call the subconscious.” But this is a special case: a stream of consciousness with an observer. As an external observer, the narrator can only guess at what is going on in the other’s mind. As a self-narrator who describes the zadushevnyi—“innermost,” or, translating literally, the “behind-the-soul”—side of one day’s life, he faces other difficulties.

Indeed, the narrator deals with internal multiplicity, with speech, thought, and bodily movement divided, with ambivalent desires, with the dialectical drama that stands behind a motive. There is yet another layer: the splitting of the self into a protagonist and a narrator, who operate in two different timeframes. Moreover, the narrator (even when he is lost in reverie) is involved in reflections not only on the process of narrating but also on general (meta-) problems in the “historiography” of the self. Finally, he keeps referring to the residue of that which cannot be expressed and explained. How could such multiplicity be represented in the linear order of a narrative?

Time and Narrative 

Unbeknownst to the young Tolstoy, Kant had long since deplored the limitations of narrative in The Critique of Pure Reason. In narrative representation, one event as a matter of convention follows upon another. In Kant’s words, “the apprehension of the manifold of appearance is always successive”; “the representations of the parts” succeed one another. It does not follow, however, that what we represent is also in itself successive; it is just that we “cannot arrange the apprehension otherwise than in this very succession.” This is the way “in which we are first led to construct for ourselves the concept of cause”: succession suggests causality.

As yet unfamiliar with Kant’s deductions, Tolstoy attempted to break the rule of succession—to stretch the temporality of his narrative in order to account for actions and processes that occur as if simultaneously. As a result, he extended time beyond the endurance of the narrative form: the story breaks off. The narrator who describes his own being from within knows (if only subconsciously) more than he can possibly tell. Is it humanly possible to give an account of even one day in one’s own life?

There were, of course, cultural precedents. Tolstoy’s narrative strategies were largely borrowed from Laurence Sterne, who, along with Rousseau, was among his first self-chosen mentors. In 1851, in his diary, Tolstoy called Sterne his “favorite writer” (49:82). In 1851–52, he translated A Sentimental Journey from English into Russian as an exercise.

Informed by Locke’s philosophy, Sterne’s narrative strategy was to make the consciousness of the protagonist/narrator into a locus of action. Locke, unlike Augustine, hoped that time itself could be captured: He derived the idea of time (duration) from the way in which we experience a train of ideas that constantly succeed one another in our minds. It followed that the sense of self derives from the continuity of consciousness from past to future.

Sterne followed suit by laying bare the flow of free associations in the mind of the narrator. One of his discoveries concerned time and narrative: Turning the narration inward, Sterne discovered that there is a psychic time that diverges from clock time. The splitting of time results in living, and writing, simultaneously on several levels. To be true to life, Sterne’s narrator digresses. The author confronted the necessity for interweaving movements forward and backward, which alone promised to move beyond the confines of time. The combination of progression and digression, including retrospective digression, created a narrative marked by experimentation, with the narrator openly commenting on his procedures.  In the end, Sterne’s experimentation—his “realistic” representation—revealed flaws in Locke’s argument: Successive representation could not catch up with the manifold perceptions of the human mind. In brief, the narrative that attempted to represent human consciousness did not progress.

By mimicking Sterne’s narrative strategy, Tolstoy learned his first lessons in epistemology: the Cartesian shift to the point of view of the perceiving individual, the modern view on the train and succession of inner thoughts, the dependence of personal identity on the ability to extend consciousness backward to a past action, and so on. Tolstoy also confronted the restrictions that govern our apprehension and representation of time—limitations that he would continue to probe and challenge throughout his life and work, even after he had read, and fully appreciated, Kant’s Critique of Pure Reason (in 1869, as he was finishing War and Peace).

In his first diaries and in “A History of Yesterday,” proceeding by way of narrative experiments, the young Tolstoy discovered a number of things. He discovered that there was no history of today. Even in a record almost concurrent with experience, there was no present. A history was a history of yesterday. Moreover, writing a history of the individual and a self-history, he was confronted with the need to account not only for the order of events but also for a whole other domain: the inner life. Uncovering the inner life led to further temporal refraction: From an inside point of view, it appeared that behind an event or action there stood a whole array of simultaneous processes. This led to another discovery.

Excerpted from “’Who, What Am I?’: Tolstoy Struggles to Narrate the Self” by Irina Paperno. Copyright © 2014 by Irina Paperno. Reprinted by arrangement with Cornell University Press. All rights reserved.

http://www.salon.com/2015/01/11/leo_tolstoys_theory_of_everything/?source=newsletter