Dying, with a lifetime of literature 

When I was diagnosed with a terminal illness, I didn’t expect the books I taught for 30 years to define how I coped


When I was diagnosed with a terminal illness, I was determined to not let the disease define me. With the exception of one fundraiser, I declined offers to give talks, blog, or even write a narrative essay about my struggles with the debilitating symptoms of amyotrophic lateral sclerosis (or ALS). While I held to my promise, I hadn’t expected that the classic literature I’d been teaching for 30 years would define how I coped with my illness.

I was diagnosed with ALS in August 2015 — on the Friday before school was to resume. I slipped into my classroom that weekend and filled a box with mementos, leaving behind my personal copies of literary masterpieces and cabinets filled with curricula — at least I thought I was abandoning a lifetime of literature.

At first, my thoughts of school brought only relief. Relief that I had left my job before greeting 150-plus new students, taking them on a journey of uncertainty and loss; at 17, they had plenty else to worry about.

As the months passed, however, my physical strength waned, and unlike the industrious doer I’d been my whole life, I became dependent upon others. After losing the use of my hands, it became difficult to find meaningful ways to spend my time. I despaired at having no control over my life. I strove to focus on the moments of the day when I was warmed by a kind word or an image of natural beauty. When I did pause to appreciate these instances, I’d hear the words “This one is warmed . . .”

At first I was at a loss for the source of the line. I was certain it was from a Toni Morrison novel, but when I consulted Google, I was reminded that the phrase harkened from Morrison’s Nobel Prize acceptance speech. Near the conclusion of her lecture, she tells a brief story about a wagon filled with slaves journeying to a plantation where their lives will end. The driver stops at an inn for a meal, leaving the slaves shivering in the back of the wagon. Two children tend to the slaves, giving them food and sips of warm cider. Morrison sums up this respite from pain and impending hopelessness: “The next stop would be their last, but this one was warmed.” I too was nearly at my last stop — death — but pausing to appreciate the moments that were warmed by small gestures and glimpses of natural beauty dulled the pangs of despair, and I had Morrison to thank for expressing the ineffable emotions that I may have missed had it not been for her words echoing in my mind.

With my mobility limited and my voice diminished, I would often lie in bed and find myself bothered by ridiculous things: a shriveled leaf on a house plant or a crooked lampshade. When someone entered the room to visit, I would seek a way to ask them to correct the irritant. But if they plucked the wrong leaf or didn’t understand me at all, I would usually realize the foolishness of wasting energy on getting my way and somewhere from the recesses of my memory be reminded, “Do not seek to be master of all . . . ”

At first I assumed those words harkened from the New Testament or possibly the humanist Shakespeare, but when Google insisted that Sophocles quilled those lines, I found myself sharing the tragic stage with Oedipus as his brother-in-law Creon admonishes him for failing to learn that fate cannot be circumvented. Oedipus and I both had to learn acceptance. Although acceptance sounds like a passive stance, it would become the hardest work of my life.

Acquiescing to my fate and allowing others to do what they deemed best for my unfamiliar and uncooperative body took patience. But when my confinement to a wheelchair required dismantling my office into a bedroom, replacing my desk with a ramp and my bookcase with a portable commode, I balked. Like Kafka’s Gregor Samsa, who awakened to find himself a giant cockroach, I felt alien in appearance and among unfamiliar surroundings. In an attempt to make me more comfortable, my family, like Gregor’s, had replaced my furniture — my identity — with utility. I slept in a motorized bed with bars and awoke alone, with a green button to summon my husband to untangle my limp legs from the blankets. Despite the love and care I was receiving, nothing except the occasional dream let me pretend that I was myself.

Eventually all my inner turmoil will have to give way to complete surrender. I’m not quite there yet. But I do hear one of Hamlet’s less-famous lines, spoken after most of the chaos in the play subsides: “Let be.” Resonating in those two words is Hamlet’s acceptance: “There is special providence in the fall of a sparrow.” I pray that I may soon die accepting this lesson that’s taken a lifetime to learn.

Lynette Williamson taught high school English and coached debate for 30 years in Sonoma County, California, where she and her husband Don raised two children who had better keep their promise to their mother and produce grandchildren one day. 

It’s not an attack on the arts, it’s an attack on communities

Things could get worse, much worse. The president’s proposed budget eliminates much of the government’s long-standing commitment to the arts, to science, to education, to culture, to public broadcasting and community development. It calls not only for the elimination of the National Endowment for the Arts, the National Endowment for the Humanities, the Corporation for Public Broadcasting and the Institute of Museum and Library Services, but also for the elimination of groups such as the Woodrow Wilson Center, a highly respected think tank that studies national and international affairs and just happens to be hosting a program Thursday called “The Muse of Urban Delirium: How the Performing Arts Paradoxically Transform Conflict-Ridden Cities Into Centers of Cultural Innovation.” It’s almost as if someone tried to fit as many dirty words — dirty in the current administration’s way of thinking — as possible into one evening: Arts, Cities, Culture, Paradox, Innovation.

These cuts aren’t about cost savings — they’re far too small to make even a dent in the federal budget. They are carefully calculated attacks on communities, especially those that promote independent thinking and expression, or didn’t line up behind the Trump movement as it swept to power through the electoral college in November. But the president’s proposed budget also includes attacks on communities that did indeed support Trump but that are too powerless to resist. Among the independent agencies set for elimination: the Appalachian Regional Commission, which supports things such as job training, economic diversification (including the arts), tourism initiatives and Internet access in states like West Virginia, Alabama and Kentucky.

The strategy, perfectly calculated for a new era of rancor and resentment amplified by social media, is to focus people not on what will be lost, but on who will lose. Why attack communities that support you? Because losing isn’t just a question of what side, what arguments, what ideology prevails in the political debate. Rather, losing is a stigma, a scarlet letter to hang on the necks of people who are losers. Losers are essential to the project of building a new political coalition, a coalition that celebrates winning. Winners are strong; losers are sad. If your aversion to being branded a loser is strong enough, you may even embrace policies that cause you harm.

President Trump’s proposed budget calls for the elimination of the National Endowment for the Arts, the National Endowment for the Humanities, the Institute of Museum and Library Services, and the Corporation for Public Broadcasting. Small and rural programs would be hit hardest. (Erin Patrick O’Connor/The Washington Post)

Read through The Washington Post’s coverage of the budget proposal, and you hear what begins to sound like a broken record: These cuts will primarily affect marginalized or minority communities, people on the losing end of the American Dream. From an article about the Interior Department: “Historic-sites funding is important,” according to one expert, “because it supports tribal preservation officers and provides grants to underrepresented communities.” Or from the Labor Department: “The Trump administration proposed $2.5 billion in cuts for the Labor Department in a plan that would significantly reduce funding for job training programs for seniors and disadvantaged youth.”

Just in time for today’s announcement is an op-ed by Washington Post columnist George Will, who also calls for the elimination of the NEA. Will’s article would be a risible period piece — he is still seething over culture-war debates from more than a quarter century ago — if his hostility to the arts were not politically empowered by the democratic peculiarities of the last election, which brought into office a deeply unpopular president allied (for now) to a Congress pursuing deeply unpopular policies because many of its members are protected by gerrymandering.

Will rehashes the usual arguments: He reminds readers of a handful of grants that were deemed offensive by some in the early 1990s; he asserts that people will pay for the arts if they want the arts, and that state and local arts agencies will step up if the federal government (which helps fund these agencies) forsakes them; and he argues that the arts are no different, no more a social good, and have no more utility or spiritual value than “macaroni and cheese.” He not only fails to understand the nature of the arts, he also fails to understand the uniquely American three-legged stool system of federal stimulus allied to state and local support and bolstered by private donations that has enriched the arts and the country for more than half a century.

“The myriad entities with financial interests in preserving the NEA cloyingly call themselves the ‘arts community,’ a clever branding that other grasping factions should emulate,” he writes, cloyingly. “The ‘arts community’ has its pitter-patter down pat. The rhetorical cotton candy — sugary, jargon-clotted arts gush — asserts that the arts nurture ‘civically valuable dispositions’ and a sense of ‘community and connectedness.’ And, of course, ‘diversity’ and ‘self-esteem.’ ”

The arts have a powerful economic effect on our society and employ vast numbers of people, but the arts community is hardly an assemblage of cynical, self-interested, deep-pocketed financial interests (for that, look to the president’s Cabinet). The “pitter-patter” of this rapacious arts juggernaut is indeed well practiced by now, but only because attacks on the arts are now a seasonal performance from a determined minority political faction. The arts do indeed foster a sense of “community and connectedness” . . . in places like Nebraska, Alaska, Missouri, Nevada, Georgia, Tennessee and Alabama. And the other 43 states of the Union. And not only do they nurture diversity, they also express and preserve the variegated richness of culture celebrated in that musty old Latin phrase “E pluribus unum” (it’s on the money, if you want to check).

But the most jejune moment of Will’s extraordinary performance is this: “What, however, is art? We subsidize soybean production, but at least we can say what soybeans are.” For a few centuries now, it has been the nature of art to wonder what art is. That’s how the arts think, how they operate, how they define the parameters of aesthetic experience. And for the entire history of the species, art has been fundamentally different, less tangible, less utilitarian in its function, than soybeans. These things are obvious, if you’ve ever spent time with the arts community, which in fact exists and adds immeasurably to the stability, cohesion, intelligence, beauty and resilience of the nation.

The Doors took their name from the title of Aldous Huxley’s book ‘The Doors of Perception’

Feb 10, 2017

The Doors, one of the most influential and revolutionary rock bands of the sixties, were formed in Los Angeles in 1965 by UCLA film students Jim Morrison and Ray Manzarek.

It all started on LA’s Venice Beach in July 1965, when Morrison told Manzarek that he had been writing songs and sang ‘Moonlight Drive’ to him. Manzarek was speechless; he had never heard lyrics to a rock song like that before.

Promotional photo of The Doors.

Manzarek, an organist, had just formed a band called Rick & the Ravens with his brothers Rick and Jim, and since they were searching for a vocalist and drummer, he asked Morrison to join them. Drummer John Densmore of The Psychedelic Rangers joined the band and soon after they recorded six Morrison songs: ‘Moonlight Drive,’ ‘My Eyes Have Seen You,’ ‘Hello, I Love You,’ ‘Go Insane,’ ‘End of the Night,’ and ‘Summer’s Almost Gone.’ Manzarek’s brothers Rick and Jim didn’t like the recordings and decided to leave the band. John’s friend, guitarist Robby Krieger, who was previously also a member of The Psychedelic Rangers, then joined the band. They never found a new bass player, so Manzarek played bass on his organ. They renamed the band “The Doors.”

Let’s break on through to the other side and find out what influenced The Doors to name their band. It was Jim Morrison who proposed the name The Doors to his band mates. He was inspired by William Blake via Aldous Huxley’s book on mescaline, The Doors of Perception.

Morrison chose the band’s name after reading Aldous Huxley’s The Doors of Perception, which got its title from a quote in a book written by William Blake, “The Marriage of Heaven and Hell.” The quote is as follows, “If the doors of perception were cleansed, everything would appear to man as it is, infinite. For man has closed himself up, till he sees all things thro’ narrow chinks of his cavern.”

Copy H, Plate 14 of one of William Blake’s unique hand-painted editions of The Marriage of Heaven and Hell, created for the original printing and currently held at The Fitzwilliam Museum. The line from which Huxley drew his title appears in the second-to-last paragraph.

Apparently, the works of William Blake and Aldous Huxley influenced not just Jim Morrison, but also Ray Manzarek. In 1967, Newsweek published an article about the Doors titled “This Way to the Egress,” where Manzarek was quoted discussing the name of the band:

“There are things you know about,” says 25-year-old Manzarek, whose specialty is playing the organ with one hand and the bass piano with the other, “and things you don’t, the known and the unknown, and in between are the doors – that’s us. We’re saying that you’re not only spirit, you’re also this very sensuous being. That’s not evil, that’s a really beautiful thing. Hell appears so much more fascinating and bizarre than heaven. You have to ‘break on through to the other side’ to become the whole being.”

The Doors went on to become one of the most famous rock bands. In January 1967, their debut album was released and was a massive hit, reaching number two on the US chart. The same year in October they released their second album ‘Strange Days,’ which was also well received. ‘Waiting for the Sun’ was released in 1968 and that year the band made their first performance outside of North America.

They performed throughout Europe, including a show in Amsterdam where Morrison collapsed on stage after a drug binge. In June 1969, they released ‘The Soft Parade,’ and the next year they released their fifth studio album, ‘Morrison Hotel.’ In 1971, soon after ‘L.A. Woman’ was recorded, Morrison moved to Paris to concentrate on his writing. On 3 July 1971, his body was found in the bathtub in his apartment. The rock legend apparently died of a drug overdose.

Jim Morrison’s grave at the Père Lachaise Cemetery in Paris.

The rest of the band tried to continue without him and released two more albums, but the band eventually split in 1973.

 

Why do conservatives want the government to defund the arts?

The cuts are largely driven by an ideology to shrink the federal government and decentralize power

Vinila Dasgupta retouches her art during the India Art Fair in New Delhi, India, Thursday, Feb. 2, 2017. The four-day art fair brings together a number of modern and contemporary artists to present their works. (Credit: AP Photo/Tsering Topgyal)
This article was originally published on The Conversation.

Recent reports indicate that Trump administration officials have circulated plans to defund the National Endowment for the Arts (NEA), putting this agency on the chopping block – again.

Conservatives have sought to eliminate the NEA since the Reagan administration. In the past, arguments were limited to the content of specific state-sponsored works that were deemed offensive or immoral – an offshoot of the culture wars.

Now the cuts are largely driven by an ideology to shrink the federal government and decentralize power. The Heritage Foundation, a conservative think tank, argues that government should not use its “coercive power of taxation” to fund arts and humanities programs that are neither “necessary nor prudent.” The federal government, in other words, has no business supporting culture. Period.

But there are two major flaws in conservatives’ latest attack on the NEA: The aim to decentralize the government could end up dealing local communities a major blow, and it ignores the economic contribution of this tiny line item expense.

The relationship between government and the arts

Historically, the relationship between the state and culture is as fundamental as the idea of the state itself. The West, in particular, has witnessed an evolution from royal and religious patronage of the arts to a diverse range of arts funding that includes sales, private donors, foundations, corporations, endowments and the government.

Prior to the formation of the NEA in 1965, the federal government strategically funded cultural projects of national interest. For example, the Commerce Department subsidized the film industry in the 1920s and helped Walt Disney skirt bankruptcy during World War II. The same could be said for the broad range of New Deal economic relief programs, like the Public Works of Art Project and the Works Progress Administration, which employed artists and cultural workers. The CIA even joined in, funding Abstract Expressionist artists as a cultural counterweight to Soviet Realism during the Cold War.

The NEA came about during the Cold War. In 1963, President John F. Kennedy asserted the political and ideological importance of artists as critical thinkers, provocateurs and powerful contributors to the strength of a democratic society. His attitude was part of a broader bipartisan movement to form a national entity to promote American arts and culture at home and abroad. By 1965, President Johnson took up Kennedy’s legacy, signing the National Arts and Cultural Development Act of 1964 – which established the National Council on the Arts – and the National Foundation on the Arts and Humanities Act of 1965, which established the NEA.

Since its inception, the NEA has weathered criticism from the left and right. The right generally argues state funding for culture shouldn’t be the government’s business, while some on the left have expressed concern about how the funding might come with constraints on creative freedoms. Despite complaints from both sides, the United States has never had a fully articulated, coherent national policy on culture, unless – as historian Michael Kammen suggests – deciding not to have one is, in fact, policy.

Flare-ups in the culture wars

Targeting of the NEA has had more to do with the kind of art the government funded than any discernible impact on the budget. The amount in question – roughly US$148 million – is a drop in the morass of a $3.9 trillion federal budget.

Instead, the arts were a focus of the culture wars that erupted in the 1980s, which often provoked legislative grandstanding for elimination of the NEA. Hot-button NEA-funded pieces included Andres Serrano’s “Immersion (Piss Christ)” (1987), Robert Mapplethorpe’s photo exhibit “The Perfect Moment” (1989) and the case of the “NEA Four,” which involved the rejection of NEA grant applications from performance artists Karen Finley, Tim Miller, John Fleck and Holly Hughes.

In each case, conservative legislators isolated an artist’s work – connected to NEA funding – that was objectionable due to its sexual or controversial content, such as Serrano’s use of Christian iconography. These artists’ works, then, were used to stoke a public debate about normative values. Artists were the targets, but often museum staff and curators bore the brunt of these assaults. The NEA Four were significant because the artists had grants unlawfully rejected based upon standards of decency that were eventually deemed unconstitutional by the Supreme Court in 1998.

As recently as 2011, former Congressmen John Boehner and Eric Cantor targeted the inclusion of David Wojnarowicz’s “A Fire in My Belly, A Work in Progress” (1986-87) in a Smithsonian exhibition to renew calls to eliminate the NEA.

In all these cases, the NEA had funded artists who either brought attention to the AIDS crisis (Wojnarowicz), invoked religious freedoms (Serrano) or explored feminist and LGBTQ issues (Mapplethorpe and the four performance artists). Controversial artists push the boundaries of what art does, not just what art is; in these cases, the artists were able to powerfully communicate social and political issues that elicited the particular ire of conservatives.

A local impact

But today, it’s not about the art itself. It’s about limiting the scope and size of the federal government. And that ideological push presents real threats to our economy and our communities.

Organizations like the Heritage Foundation fail to take into account that eliminating the NEA would actually cause the collapse of a vast network of regionally controlled, state-level arts agencies and local councils. In other words, they won’t simply be defunding a centralized bureaucracy that dictates elite culture from the sequestered halls of Washington, D.C. The NEA is required by law to distribute 40 percent of its budget to arts agencies in all 50 states and six U.S. jurisdictions.

Many communities – such as Princeton, New Jersey, which could lose funding for local cultural institutions like the McCarter Theatre – are anxious about how threats to the NEA will affect them.

Therein lies the misguided logic of the argument for defunding: It targets the NEA but in effect threatens funding for programs like the Creede Repertory Theatre – which serves rural and underserved communities in states like Colorado, New Mexico, Utah, Oklahoma and Arizona – and Appalshop, a community radio station and media center that creates public art installations and multimedia tours in Jenkins, Kentucky to celebrate Appalachian cultural identity.

While the present administration and the conservative movement claim they’re simply trying to save taxpayer dollars, they also ignore the significant economic impacts of the arts. The Bureau of Economic Analysis reported that the arts and culture industry generated $704.8 billion of economic activity in 2013 and employed nearly five million people. For every dollar of NEA funding, there are seven dollars of funding from other private and public funds. Elimination of the agency endangers this economic vitality.

Ultimately, the Trump administration needs to decide whether artistic and cultural work is important to a thriving economy and democracy.


Aaron D. Knochel, Assistant Professor of Art Education, Pennsylvania State University

http://www.salon.com/2017/02/08/why-do-conservatives-want-the-government-to-defund-the-arts_partner/?source=newsletter

2017 Isn’t ‘1984’—It’s Stranger Than Orwell Imagined

Orwell could not have imagined the internet and its role in distributing alternative facts.


A week after President Donald Trump’s inauguration, George Orwell’s “1984” is the best-selling book on Amazon.com.

The hearts of a thousand English teachers must be warmed as people flock to a novel published in 1949 for ways to think about their present moment.

Orwell set his story in Oceania, one of three blocs or mega-states fighting over the globe in 1984. There has been a nuclear exchange, and the blocs seem to have agreed to perpetual conventional war, probably because constant warfare serves their shared interests in domestic control.

Oceania demands total subservience. It is a police state, with helicopters monitoring people’s activities, even watching through their windows. But Orwell emphasizes it is the “ThinkPol,” the Thought Police, who really monitor the “Proles,” the lowest 85 percent of the population outside the party elite. The ThinkPol move invisibly among society seeking out, even encouraging, thoughtcrimes so they can make the perpetrators disappear for reprogramming.

The other main way the party elite, symbolized in the mustached figurehead Big Brother, encourage and police correct thought is through the technology of the Telescreen. These “metal plaques” transmit things like frightening video of enemy armies and of course the wisdom of Big Brother. But the Telescreen can see you, too. During mandatory morning exercise, the Telescreen not only shows a young, wiry trainer leading cardio, it can see if you are keeping up. Telescreens are everywhere: They are in every room of people’s homes. At the office, people use them to do their jobs.

The story revolves around Winston Smith and Julia, who try to resist their government’s overwhelming control over facts. Their act of rebellion? Trying to discover “unofficial” truth about the past, and recording unauthorized information in a diary. Winston works at the colossal Ministry of Truth, on which is emblazoned IGNORANCE IS STRENGTH. His job is to erase politically inconvenient data from the public record. A party member falls out of favor? She never existed. Big Brother made a promise he could not fulfill? It never happened.

Because his job calls on him to research old newspapers and other records for the facts he has to “unfact,” Winston is especially adept at “doublethink.” Winston calls it being “conscious of complete truthfulness while telling carefully constructed lies… consciously to induce unconsciousness.”

Oceania: The product of Orwell’s experience

Orwell’s setting in “1984” is inspired by the way he foresaw the Cold War – a phrase he coined in 1945 – playing out. He wrote it just a few years after watching Roosevelt, Churchill and Stalin carve up the world at the Tehran and Yalta conferences. The book is remarkably prescient about aspects of the Stalinist Soviet Union, East Germany and Maoist China.

Orwell was a socialist. “1984” in part describes his fear that the democratic socialism in which he believed would be hijacked by authoritarian Stalinism. The novel grew out of his sharp observations of his world and the fact that Stalinists tried to kill him.

In 1936, a fascist-supported military coup threatened the democratically elected socialist majority in Spain. Orwell and other committed socialists from around the world, including Ernest Hemingway, volunteered to fight against the rightist rebels. Meanwhile, Hitler lent the rightists his air power while Stalin tried to take over the leftist Republican resistance. When Orwell and other volunteers defied these Stalinists, the Stalinists moved to crush the opposition. Hunted, Orwell and his wife had to flee for their lives from Spain in 1937.

George Orwell at the BBC.

Back in London during World War II, Orwell saw for himself how a liberal democracy and individuals committed to freedom could find themselves on a path toward Big Brother. He worked for the BBC writing what can only be described as “propaganda” aimed at an Indian audience. What he wrote was not exactly doublethink, but it was news and commentary with a slant to serve a political purpose. Orwell sought to convince Indians that their sons and resources were serving the greater good in the war. Having written things he believed were untrue, he quit the job after two years, disgusted with himself.

Imperialism itself disgusted him. As a young man in the 1920s, Orwell had served as a colonial police officer in Burma. In a distant foreshadowing of Big Brother’s world, Orwell reviled the arbitrary and brutish role he took on in a colonial system. “I hated it bitterly,” he wrote. “In a job like that you see the dirty work of Empire at close quarters. The wretched prisoners huddling in the stinking cages of the lock-ups, the gray, cowed faces of the long-term convicts…”

Oceania was a prescient product of a particular biography and particular moment when the Cold War was beginning. Naturally, then, today’s world of “alternative facts” is quite different in ways that Orwell could not have imagined.

Big Brother not required

Orwell described a single-party system in which a tiny core of oligarchs, Oceania’s “inner party,” controls all information. This is their chief means of controlling power. In the U.S. today, information is wide open to those who can access the internet, at least 84 percent of Americans. And while the U.S. arguably might be an oligarchy, power exists somewhere in a scrum including the electorate, the constitution, the courts, bureaucracies and, inevitably, money. In other words, unlike in Oceania, both information and power are diffuse in 2017 America.

Those who study the decline in standards of evidence and reasoning in the U.S. electorate chiefly blame politicians’ concerted efforts from the 1970s to discredit expertise, degrade trust in Congress and its members, even question the legitimacy of government itself. With those leaders, institutions and expertise delegitimized, the strategy has been to replace them with alternative authorities and realities.

In 2004, a senior White House adviser suggested a reporter belonged to the “reality-based community,” a sort of quaint minority of people who “believe that solutions emerge from your judicious study of discernible reality.… That’s not the way the world really works anymore.”

Orwell could not have imagined the internet and its role in distributing alternative facts, nor that people would carry around Telescreens in their pockets in the form of smartphones. There is no Ministry of Truth distributing and policing information, and in a way everyone is Big Brother.

It seems less that people are incapable of seeing through Big Brother’s big lies than that they embrace “alternative facts.” Some researchers have found that when some people begin with a certain worldview – for example, that scientific experts and public officials are untrustworthy – they believe their misperceptions more strongly when given accurate conflicting information. In other words, arguing with facts can backfire. Having already decided what is more essentially true than the facts reported by experts or journalists, they seek confirmation in alternative facts and distribute them themselves via Facebook, no Big Brother required.

In Orwell’s Oceania, there is no freedom to speak facts except those that are official. In 2017 America, at least among many of the powerful minority who selected its president, the more official the fact, the more dubious. For Winston, “Freedom is the freedom to say that two plus two make four.” For this powerful minority, freedom is the freedom to say two plus two make five.

This article was originally published on The Conversation. Read the original article.

50 Years Later, Here Are 3 Big Ways the Summer of Love Is Still with Us

The ideals of the Human Be-In remain woven through American culture.

Members of Jefferson Airplane performing at the KFRC Fantasy Fair and Magic Mountain Music Festival in Marin County, California, in June 1967.
Photo Credit: Bryan Costales © 2009, licensed CC BY-SA 3.0 (Bcx.Org: http://www.bcx.org/photos/events/concerts/ffair/?file=KFRCFantasyFair19670603_7464SBCX.jpg), via Wikimedia Commons.

Born of the simple intention to unite people in the name of connection and love, an event on the polo fields of Golden Gate Park half a century ago sparked a cultural paradigm shift unrivaled in the U.S. since World War II. But this time it was the antithesis to war that would reshape America: the Summer of Love.

The impetus for that fateful summer was called the Human Be-In, in a nod to the peaceful sit-ins waged by university students in the preceding years against racial segregation. In the years surrounding the Summer of Love, the frigid prospect of nuclear war loomed, minorities and women were rising up against myriad oppressions and the government was cracking down on mind-altering substances like LSD and cannabis. The Summer of Love and its values of free expression, love, peace, activism, and psychedelic exploration of consciousness were the backlash.

The early acid-rock sounds of Grateful Dead, Jefferson Airplane, Big Brother and the Holding Co. and others mixed with the words of boundary-pushing poets and psychedelic pioneers to gather 75,000 or so young people in the park. They spilled out into the five-block radius of the Haight-Ashbury neighborhood with fresh smells, sounds and ideals that came to shape the era’s iconography.

Bill McCarthy, founder of the Unity Foundation, co-produced a 50th-anniversary celebration of the Be-In in San Francisco this week.

“It’s important that we celebrate the past, celebrate the victories, triumphs and challenges of the past, but at the same time look at what’s happening today,” he said. “We’re saying yes, in 1967 this all happened, so let’s rededicate ourselves to that. But let’s also see what’s happening today that can build community, build empathy with people all over the world that are struggling.”

He said given the current political climate, with Trump’s impending inauguration and all that’s bound to come with it, there is more reason than ever to “activate ourselves.” He said when you take the “long view” from 1967 to now, it’s obvious that we’re moving forward.

“The values we treasure and movements we created are still stronger than they ever have been,” he said. “When there’s darkness in the world, the thing that feeds darkness is fear. The last thing we should do right now is be fearful.”

Fifty years since the Be-In, as the digital age re-molds the economy, values and skylines of San Francisco and beyond, the ideals of the Human Be-In remain woven through our culture in ways we rarely pause to acknowledge. From the sounds of activism to the shape of companies to that box of free stuff out on the corner, many hippie dreams are alive and well in 2017.

Annie Oak, founder of the Women’s Visionary Congress, a nonprofit dedicated to exploring altered states of consciousness, says the prevalence of psychedelics in the 1960s and ’70s is directly related to the ideas put forth by young people at the time.

“These substances allowed people to think way outside the box and also question social systems,” she said. “The hippies here really put forward a liberal political consciousness and humanist values that impacted society.”

Here are three modern cultural shifts that have their roots in the psychedelic Summer of Love.

1. Collectivism, from communal living to open-source software. 

Annie Oak says communal living, which is everywhere now, was born in the Summer of Love. So, she says, were collectivist projects like the Haight-Ashbury Free Clinic, which is still in operation, offering medical treatment free of charge.

“These ideas of collectivism really launched larger ideas like the open-source software movement and creative commerce,” she notes. “These are ideas that are commonplace now.”

Michael Gosney has produced Digital Be-Ins over the years at Be-In anniversaries to pay homage to the initial Be-In of ’67 and to look to the future. He was involved in early desktop publishing and digital media in San Francisco in the late ’80s. It was the dawn of personal computers, and his magazine was covering early Macintosh creativity. He describes the publication as a “nexus of artists and tech people coming together.”

Between ’85 and ’92 he observed that psychedelics—which made their debut in modern culture during the Summer of Love—heavily influenced the creation of digital media. He says the software programmers who worked on digital music, animation, photography and video were influenced by psychedelics.

“I noticed the preponderance of psychedelic influence in the programming community with the engineers that were inventing these new tools,” he said. “Psychedelic influence was extremely powerful, and really that’s how people were seeing the vision of digital networks and so forth. It very much came out of the influence of psychedelics.”

2. Activism and alternative media.

The mainstream newspapers in 1967 were not about to promote the Be-In event. An underground, independent zine called the Oracle, produced for free in Haight-Ashbury, was the first to cover what would become the catalyst for the hippie days and cultural revolution.

“The Oracle was the first to write about the Be-In, so it helped launch the alternative press,” Annie Oak of WVC says. “And there were also underground radio stations that helped promote the events, so the whole alternative media movement really was moved along by the Be-In and the Summer of Love.”

Oak notes that the environmental movement was also taking place in Haight-Ashbury at the time. The local community organized in the ’60s against a proposed freeway project that would run through the panhandle portion of Golden Gate Park, connecting the Golden Gate Bridge with the Peninsula. The community organized in protest on the same polo grounds where the initial Be-In took place, and their uprising eventually killed the freeway project. This was in 1964, but Oak says the power of community organizing was a key motif of the ’67 Be-In and its cultural imprints.

“The freeway was one of the important predecessors of the Be-In activism and gathering that took place also in the polo grounds three years later, and the later protests against the war,” she said. “Timothy Leary kind of set the tone with his famous phrase, turn on, tune in, drop out, which kind of set the tone for the Be-In. But what really happened here is people kind of turned on to activism, and then took over. They took over big sections of our culture and changed it in positive ways.”

Oak notes the irony that because of the proposed freeway project, which would have displaced many residents, the Haight-Ashbury neighborhood harbored lower-income residents like students and minorities. As the years passed following the Summer of Love, the neighborhood became an iconic tourist destination. Today, as wealthy techies have been drawn to the city for its iconic allure, lower-income residents are priced out.

“Haight-Ashbury sort of personified the transition between the beat generation—the poets and jazz hipsters that were embracing a lot of the black jazz culture—and the hippies, who then kind of came into what was then a black neighborhood,” Oak says. “And, to some degree, later that movement ironically gentrified the neighborhood, and a lot of the black community then left. It was a very complex form of gentrification, and that gentrification is still happening.”

Bill McCarthy of Unity Foundation said in planning the Be-In anniversary this year he had a conversation with author and historian Dennis McNally about how the mainstream media of the time co-opted the Summer of Love.

“[McNally] was saying… the media created the hippie and created this—how we should look at the culture, and that was part of the downfall,” McCarthy said. “And to that I said, well, Dennis, the beautiful thing now is we can create our own media. We’re not saddled by ABC, NBC, CBS, whatever anymore. We have our own media vehicles.”

3. Cannabis legalization and psychedelic science are influencing mainstream medicine.

Two years prior to the Summer of Love, LSD, the psychedelic that many young people associated with spiritual enlightenment and creative expression, was criminalized, like cannabis before it. Retaliating against the Summer of Love and the progressive concepts it launched, President Richard Nixon waged the racist, violent (and ultimately failed) war on drugs that vilified psychedelics and cannabis in the public eye for decades.

Cannabis and most psychedelics remain federally illegal to this day, though the pendulum is starting to swing back. Eight U.S. states have legalized weed for adult use, and this decade the first U.S. government-approved human trials assessing psychedelics in tandem with psychotherapy treatment are showing overwhelmingly positive results. Most of the studies are sponsored by the Multidisciplinary Association for Psychedelic Studies (MAPS), a nonprofit group founded by Rick Doblin in 1986.

Doblin said the Summer of Love set society on a path toward important cultural shifts.

“Since the iconic Summer of Love, 50 years ago, marijuana has gone from being a heavily demonized drug used by rebellious youth to a medicine, with one of the largest growing demographics being elderly people,” he said. “Psychedelics now are being investigated as tools used in scientific research for therapeutic uses, a catalyst of spirituality, art and creativity, acceptance of death and we are now facing their legitimization and acceptance as medical tools.”

In addition, MAPS is conducting studies of MDMA’s potential to help treat post-traumatic stress disorder, researching the use of ibogaine for opiate addiction and “implementing ayahuasca research for PTSD and broadening psychedelic harm reduction outreach for more widespread acceptance into our culture,” Doblin said. Similar to the path of cannabis in culture, he predicts psychedelics will first be accepted medicinally, then for their broadened spiritual and cultural uses.

“One day people will take for granted that psychedelics are legal, are highly prized, and help people make positive contributions to society,” he said.

April M. Short is a yoga teacher and writer who previously worked as AlterNet’s drugs and health editor. She currently works part-time for AlterNet, and freelances for a number of publications nationwide. 

http://www.alternet.org/culture/50-years-later-here-are-3-big-ways-summer-love-still-us?akid=15118.265072.82O0Sv&rd=1&src=newsletter1070698&t=14

Bad times make great art?

 Worlds of light and shadow: The reproduction of liberalism in Weimar Germany

The claim that good art comes from hard times is the height of delusionally entitled thinking


Fritz Lang’s “Metropolis” (1927) (Credit: Kino International)

On election night a murmur started just as the last gasp faded: “Well, at least we can expect some great art.” At first the sentiment was a fatalistic one-off, a brave face, a shy hope that something good would come from the dark days forecast for the Trump presidency. It didn’t take long for the statement to acquire a predictive tone; eventually a waft of desperation was detectable and, ultimately, a shrill fiat.

The art of protest is provocative, no question. It’s often brave, usually fierce, sometimes compelling and occasionally inspirational. But is the appeal of the books, films, poetry, painting, television and sculpture produced in response to tyranny, oligarchic pomposity or a fetishistic prioritization of the bottom line universal or simply reactive? How durable is the art birthed from protest? The following essay is the second in a series for Salon exploring the question Do bad times really inspire great art?

On Nov. 6 of this year, just two days before the presidential election, aging American punks Green Day took the stage at the MTV Europe Music Awards to perform their 2004 Bush II-era modern pop-punk staple, “American Idiot.”

Singer Billie Joe Armstrong snarled in the vague direction of then-presidential hopeful, now president-elect Donald J. Trump, asking the audience of largely Dutch citizens possessing close to zero influence on the American political conversation, “Can you hear the sound of hysteria? The subliminal mind-Trump America.”

Apart from the lyrics not making a lot of sense, it also had no effect whatsoever on the outcome of the election. However well-meaning, Armstrong and Co. would have been just as effective by writing “DO NOT VOTE FOR DONALD TRUMP” on a piece of paper, cramming it in a bottle, and chucking it into the ocean, or by whispering “Trump is bad” into a hole.

The clear lesson: punk is dead. And not only that, but it’s been poisoned, drowned, hanged, beaten, stabbed, killed, re-killed and killed again, like some slobbering Rasputin-ish zombie. So when people claim, desperately, that Trump’s America will somehow lead to a resurgence in angry, politically charged guitar music, it’s all I can do to keep my eyes from rolling out of my head.

* * *

To claim that good art — that is: stuff of considerable aesthetic merit, which is maybe even socially advantageous — comes from hard times is the height of delusionally entitled thinking, as if mass deportations and radicalized violence are all in the service of a piece of music. Of course, even the idea of what qualifies as “good times” must be qualified. Given that Trump won the election, it stands to reason that for a majority of Americans (or at least for a majority of electoral college representatives) the prospect of a Trump presidency is a beneficial thing, which will usher in a new epoch of prosperity and big-league American greatness.

There may be truth, or at least the ring of truth, in the idea that objects of artistic value can be produced under the pressure of hardship. While it may be true that an artist like, say, the late Leonard Cohen was able to mine the fathomless quarries of heartache and longing for his music and poetry, it is also true that Cohen was blessed with socio-economic privilege, both in the form of family inheritances and grants from a liberal Canadian government that supported (and continues to support, in various respects) art and artists. His heart may have been hard, but the times weren’t.

At the cultural level, good art tends to emerge from good times. It’s not even about having a well-managed social welfare state (though that, of course, helps). Rather, it seems to be a matter of liberal attitudes reproducing themselves in certain contexts, leading to greater degrees of freedom and greater gains in artistic production and sophistication.

So forget Green Day for a second. Take, as an example, the Weimar Republic of Germany’s interwar period. It was a short-lived heyday of liberalism and representative democracy, flourishing smack between two periods of staunch authoritarianism: bookended by the post-unification German Empire on one side, and Nazi Germany on the other. It was in this context that some of the twentieth century’s most compelling art was created.

* * *

It’s tricky to even think about Weimar Germany without being ensnared by the sickly succour of cliché. You know: leggy chorus girls high-kicking in all-night cabarets, gays and lesbians fraternizing freely, women in short hair lighting cigarettes while the zippy strains of jaunty jazz waft hither and yon in a smoky hall — a populace caught in full thrall of freedom. Fritz Lang’s 1922 film “Dr. Mabuse, The Gambler,” the opening titles of which describe it as “A Picture of the Times,” depicts Berlin’s underworld as equally rococo in its bourgeois elegance, and chaotically debased. As the proprietor of an illegal casino puts it, summing up the free-spirited ethos of the era, “Everything that pleases is allowed.”

Emerging from the horror of the First World War, and the 1918 November Revolution that saw the imperial government sacked, the nation’s consciousness was in a state of jumble and disarray. But it was an exciting jumble, full of possibility. The philosopher Ernst Bloch compared Weimar Germany to Periclean Athens of the fifth century BCE: a time of cultural thriving, sovereign self-governance, and increased social and political equality. Germany became a hub for intellectualism, nurturing physicists like Einstein and the critical theorists of the Frankfurt School. Art indulged experimentalism and the avant-garde, united less by common aesthetic tendencies and more by shared socialist values. It was the era of Otto Dix, Bertolt Brecht, the Bauhaus group, Arnold Schoenberg and a new, expressionist tendency in cinema.

Robert Wiene’s 1919 film “The Cabinet of Dr. Caligari” embodied the spirit of this new age. It told the story of a small community preyed upon by the maniacal carnival barker Dr. Caligari (Werner Krauss), whose newest attraction is a spooky-looking sleepwalker named Cesare (the great German actor Conrad Veidt). Under cover of darkness, Caligari controls Cesare, using him to commit a string of violent crimes. With its highly stylized sets, and comments on the brutality of authority, the film presented a whole alternative vision of the world. Both stylistically and thematically, “Caligari” imagined the splintering of the postwar German psyche, presenting a sense that reality itself had been destabilized, and was reconstituting itself in jagged lines and oblique curlicues. The movie’s lasting influence is inestimable.

In his landmark work of cultural analysis, “From Caligari to Hitler: A Psychological History of the German Film,” film critic Siegfried Kracauer described the “collapse of the old hierarchy of values and conventions” in Weimar-era Germany. “For a brief while,” Kracauer writes, “the German mind had a unique opportunity to overcome hereditary habits and reorganize itself completely. It enjoyed freedom of choice, and the air was full of doctrines trying to captivate it, to lure it into a regrouping of inner attitudes.”

Certainly, German cinema of the era often explicitly figures authoritarian characters attempting to seduce the public: from Wiene’s madman Dr. Caligari, to Lang’s huckster Dr. Mabuse. For the reforming national consciousness, authority served as a kind of siren song, luring the public out of the rowdy cabarets and nightclubs and back onto the straight and narrow. By the early 1930s, attitudes seemed to be shifting. In Fritz Lang’s classic thriller “M,” from 1931, police sniff out a serial killer in part by trying to determine a psychosexual basis for his crimes. It was at once a strike against the unfettered sexual libertinism of the Berlin cabarets, and a sinister intimation of Nazism, which was notoriously marked by its pseudoscientific quackery about the biological basis of criminality and depravity. The hallmarks of Weimar — its authoritarian disenthrallment, its slackening attitudes toward sexual repression, its intoxicating cosmopolitanism — were curdling.

* * *

Weimar poses a number of compelling questions around the subject of historical and cultural Golden Ages. Such rigidly compartmentalized, epochal thinking leads inevitably to collapse. How, after all, can a “Golden Age” be defined without presuming its emergence from, and collapse back into, periods of relative darkness and doom? It recalls Karl Marx’s thinking on historical stages, outlined in volume one of “Capital,” and the idea that each historical period carries within it the seeds of its successor. And it is force, according to Marx, that serves as “the midwife of every old society pregnant with a new one.”

In the case of Weimar, the sense of expanded liberty was undercut in several respects. While the upper and middle classes grew in prosperity, the working poor were afflicted by hyperinflation, and by and large unaffected by new gains made in left-wing modernist painting, cabaret culture and avant-garde cinema. Sexual libertinism bred syphilis outbreaks. Old-stock Germans balked at the moral and aesthetic degeneracy of the new art movements. For such people, Weimar was regarded less like Periclean Athens and more like the ancient African port of Carthage: fit to be sacked, razed, and have its earth salted so that no memory of it could possibly proliferate.

It speaks to a certain historical tendency. To revise Marx, it’s not just that a given society is pregnant with the next one, but that it’s pregnant with resentments and reactions. With Weimar, expanded cultural and political liberalism emerged as a reaction to the authoritarianism of imperial Germany, with the even fiercer authoritarianism and violence of Hitler’s regime emerging as a response to that. Stereotypes of left-leaning artists cavorting in cabarets found their negative image, their doppelgänger, in nationalist thugs roving the streets.

This is not to say that it wasn’t a period of growth and advancement, artistically and otherwise. Rather, it’s a historical reminder that even periods that usher in all manner of artistic and cultural headway need to be relentlessly qualified. It’s not that good times don’t make for good art. It’s that, really, there’s never been such a thing as a distinctly, determinedly, wholly unequivocally “good time.” Even the most shimmering epochs exist in contradiction, conflict and often out-and-out hypocrisy. Like the backdrop of “Caligari,” ours has always been a world of light and shadow. Something to keep in mind as the world stumbles into what’s shaping up to be a new Periclean Golden Age of American Idiocy.

John Semley lives and works in Toronto. He is a books columnist at the Globe & Mail newspaper and the author of “This Is A Book About The Kids In The Hall” (ECW Press).