Why the Rich Love Burning Man

Burning Man became a festival that rich libertarians love because it never had a radical critique at its core.


In principle the annual Burning Man festival sounds a bit like a socialist utopia: bring thousands of people to an empty desert to create an alternative society. Ban money and advertisements and make it a gift economy. Encourage members to bring the necessary ingredients of this new world with them, according to their ability.

Introduce “radical inclusion,” “radical self-expression,” and “decommodification” as tenets, and designate the alternative society as a free space, where sex and gender boundaries are fluid and meant to be transgressed.

These ideas — the essence of Burning Man — are certainly appealing.

Yet capitalists also unironically love Burning Man, and to anyone who has followed the recent history of Burning Man, the idea that it is at all anticapitalist seems absurd: last year, a venture capitalist billionaire threw a $16,500-per-head party at the festival, his camp a hyper-exclusive affair replete with wristbands and models flown in to keep the guests company.

Burning Man is earning a reputation as a “networking event” among Silicon Valley techies, and tech magazines now send reporters to cover it. CEOs like Mark Zuckerberg of Facebook and Larry Page of Alphabet are foaming fans, along with conservative anti-tax icon Grover Norquist and many writers of the libertarian (and Koch-funded) Reason magazine. Tesla CEO Elon Musk even went so far as to claim that Burning Man “is Silicon Valley.”

Radical Self-Expression

The weeklong Burning Man festival takes place once a year over Labor Day weekend in a remote alkali flat in northwestern Nevada. Two hours north of Reno, the inhospitable Black Rock Desert seems a poor place to create a temporary sixty-thousand-person city — and yet that’s entirely the point. On the desert playa, an alien world is created and then dismantled within the span of a month. The festival culminates with the deliberate burning of a symbolic effigy, the titular “man,” a wooden sculpture around a hundred feet tall.

Burning Man grew from unpretentious origins: a group of artists and hippies came together to burn an effigy at Baker Beach in San Francisco, and in 1990 set out to have the same festival in a place where the cops wouldn’t hassle them about unlicensed pyrotechnics. The search led them to the Black Rock Desert.

Burning Man is very much a descendant of the counterculture San Francisco of yesteryear, and possesses the same sort of libertine, nudity-positive spirit. Some of the early organizers of the festival professed particular admiration for the Situationists, the group of French leftists whose manifestos and graffitied slogans like “Never Work” became icons of the May 1968 upsurge in France.

Though the Situationists were always a bit ideologically opaque, one of their core beliefs was that cities had become oppressive slabs of consumption and labor, and needed to be reimagined as places of play and revolt. Hence, much of their art involved cutting up and reassembling maps, and consuming intoxicants while wandering about in Paris.

You can feel traces of the Situationists when walking through Black Rock City, Burning Man’s ephemeral village. Though Black Rock City resembles a city in some sense, with a circular dirt street grid oriented around the “man” sculpture, in another sense it is completely surreal: people walk half-naked in furs and glitter, art cars shaped like ships or dragons pump house music as they purr down the street.

Like a real city, Burning Man has bars, restaurants, clubs, and theaters, but they are all brought by participants because everyone is required to “bring something”:

The people who attend Burning Man are no mere “attendees,” but rather active participants in every sense of the word: they create the city, the interaction, the art, the performance and ultimately the “experience.” Participation is at the very core of Burning Man.

Participation sounds egalitarian, but it leads to some interesting contradictions. The most elaborate camps and spectacles tend to be brought by the rich because they have the time, the money, or both, to do so. Wealthier attendees often pay laborers to build and plan their own massive (and often exclusive) camps. If you scan San Francisco’s Craigslist in the month of August, you’ll start to see ads for part-time service labor gigs to plump the metaphorical pillows of wealthy Burners.

The rich also hire sherpas to guide them around the festival and wait on them at the camp. Some burners derogatorily refer to these rich-person camps as “turnkey camps.”

Silicon Valley’s adoration of Burning Man goes back a long way, and tech workers have always been fans of the festival. But it hasn’t always been the province of billionaires — in the early days, it was a free festival with a cluster of pitched tents, weird art, and explosives. As the years went on, more exclusive turnkey camps appeared, multiplying in step with the ticket price — which went from $35 in 1994 to $390 in 2015 (about sixteen times the rate of inflation).

Black Rock City has had its own FAA-licensed airport since 2000, and it’s been getting much busier. These days you can even get from San Carlos in Silicon Valley to the festival for $1500. In 2012, Mark Zuckerberg flew into Burning Man on a private helicopter, staying for just one day, to eat and serve artisanal grilled cheese sandwiches. From the New York Times:

“We used to have R.V.s and precooked meals,” said a man who attends Burning Man with a group of Silicon Valley entrepreneurs. (He asked not to be named so as not to jeopardize those relationships.) “Now, we have the craziest chefs in the world and people who build yurts for us that have beds and air-conditioning.” He added with a sense of amazement, “Yes, air-conditioning in the middle of the desert!”

The growing presence of the elite in Burning Man is not just noticed by outsiders — long-time attendees grumble that Burning Man has become “gentrified.” Commenting on the New York Times piece, burners express dismay at attendees who do no work. “Paying people to come and take care of you and build for you . . . and clean up after you . . . those people missed the point.”

Many Burners seethed after reading one woman’s first-person account of how she was exploited while working at the $17,000-per-head camp of venture capitalist Jim Tananbaum. In her account, she documented the many ways in which Tananbaum violated the principles of the festival, maintaining “VIP status” by making events and art cars private and flipping out on one of his hired artists.

Tananbaum’s workers were paid a flat $180 a day with no overtime, but the anonymous whistleblower attests that she and others worked fifteen- to twenty-hour days during the festival.

The emergent class divide among Burning Man attendees is borne out by data: the Burning Man census (yes, they have a census, just like a real nation-state) showed that from 2010 to 2014, the share of attendees who make more than $300,000 a year doubled from 1.4% to 2.7%. This figure is especially significant given the outsize presence 1 percenters command at Burning Man.

In a just, democratic society, everyone has equal voice. At Burning Man everyone is invited to participate, but the people who have the most money decide what kind of society Burning Man will be — they commission artists of their choice and build to their own whims. They also determine how generous they are feeling, and whether to withhold money.

It might seem silly to quibble over the lack of democracy in the “governance” of Black Rock City. After all, why should we care whether Jeff Bezos has commissioned a giant metal unicorn or a giant metal pirate ship, or whether Tananbaum wants to spend $2 million on an air-conditioned camp? But the principles of these tech scions — that societies are created through charity, and that the true “world-builders” are the rich and privileged — don’t just play out in the Burning Man fantasy world. They carry over into the real world, often with less-than-positive results.

Remember when Facebook CEO Mark Zuckerberg decided to help “fix” Newark’s public schools? In 2010, Zuckerberg — perhaps hoping to improve his image after his callous depiction in the biopic The Social Network — donated $100 million to overhaul Newark’s school system.

The money was directed as a part of then–Newark Mayor Cory Booker’s plan to remake the city into the “charter school capital of the nation,” bypassing public oversight through partnership with private philanthropists.

Traditionally, public education has been interwoven with the democratic process: in a given school district, the community elects the school board every few years. School boards then make public decisions and deliberations. Zuckerberg’s donation, and the project it was attached to, directly undermined this democratic process by promoting an agenda to privatize public schools, destroy local unions, disempower teachers, and put the reins of public education into the hands of technocrats and profiteers.

This might seem like an unrelated tangent — after all, Burning Man is supposed to be a fun, liberating world all its own. But it isn’t. The top-down, do-what-you-want, radically-express-yourself-and-fuck-everyone-else worldview is precisely why Burning Man is so appealing to the Silicon Valley technocratic scions.

To these young tech workers — mostly white, mostly men — who flock to the festival, Burning Man reinforces and fosters the idea that they can remake the world without anyone else’s input. It’s a rabid libertarian fantasy. It fluffs their egos and tells them that they have the power and right to make society for all of us, to determine how things should be.

This is the dark heart of Burning Man, the reason that high-powered capitalists — and especially capitalist libertarians — love Burning Man so much. It heralds their ideal world: one where vague notions of participation replace real democracy, and the only form of taxation is self-imposed charity. Recall Whole Foods CEO John Mackey’s op-ed, in the wake of the Obamacare announcement, in which he proposed a healthcare system reliant on “voluntary, tax-deductible donations.”

This is the dream of libertarians and the 1 percent, and it reifies itself at Burning Man — the lower caste of Burners who want to partake in the festival are dependent on the whims and fantasies of the wealthy to create Black Rock City.

Burning Man foreshadows a future social model that is particularly appealing to the wealthy: a libertarian oligarchy, where people of all classes and identities coexist, yet social welfare and the commons exist solely on a charitable basis.

Of course, the wealthy can afford more, both in lodging and in what they “bring” to the table: so at Burning Man, those with more money, who can bring more in terms of participation, labor and charity, are celebrated more.

It is a society that we find ourselves moving closer towards the other 358 (non–Burning Man) days of the year: with a decaying social welfare state, more and more public amenities exist only as the result of the hyper-wealthy donating them. But when the commons are donated by the wealthy, rather than guaranteed by membership in society, the democratic component of civic society is vastly diminished and placed in the hands of the elite few who gained their wealth by using their influence to cut taxes and gut the social welfare state in the first place.

It’s much like how in my former home of Pittsburgh, the library system is named for Andrew Carnegie, who donated a portion of the initial funds. But the donated money was not earned by Carnegie; it trickled up from his workers’ backs, many of them suffering from overwork and illness caused by his steel factories’ pollution. The real social cost of charitable giving is the forgotten labor that builds it and the destructive effects that flow from it.

At Burning Man the 1 percenters — who have earned their money in the same way that Carnegie did so long ago — show up with an army of service laborers, yet they take the credit for what they’ve “brought.”

Burning Man’s tagline and central principle is radical self-expression:

Radical self-expression arises from the unique gifts of the individual. No one other than the individual or a collaborating group can determine its content. It is offered as a gift to others. In this spirit, the giver should respect the rights and liberties of the recipient.

The root of Burning Man’s degeneration may lie in the concept itself. Indeed, the idea of radical self-expression is, at least under the constraints of capitalism, a right-wing, Randian ideal, and could easily be the core motto of any of the large social media companies in Silicon Valley, who profit from people investing unpaid labor into cultivating their digital representations.

It is in their interest that we are as self-interested as possible, since the more we obsess over our digital identity, the more personal information of ours they can mine and sell. Little wonder that the founders of these companies have found their home on the playa.

It doesn’t seem like Burning Man can ever be salvaged, or taken back from the rich power-brokers who’ve come to adore it and now populate its board of directors. It became a festival that rich libertarians love because it never had a radical critique at its core; and, without any semblance of democracy, it could easily be controlled by those with influence, power, and wealth.

Burning Man will be remembered more as the model for Google CEO Larry Page’s dream of a libertarian state than as the revolutionary Situationist space that it could have been.

As such, it is a cautionary tale for radicals and utopianists. When “freedom” and “inclusion” are disconnected from democracy, they often lead to elitism and reinforcement of the status quo.

 

https://www.jacobinmag.com/2015/08/burning-man-one-percent-silicon-valley-tech/

Hiding in plain sight: the history of the War on Drugs

By Paul Bermanszohn

On August 13, 2015

The War on Drugs was a direct response to the African American uprisings of the 1960s. Its racist and repressive effects continue to be felt today.

Photo: A scene of the 1967 Newark Rebellion, by Don Hogan Charles.

Recent US history, from the 1960s until today, shows the War on Drugs to be a crusade of repression against African American people, incarcerating millions to prevent a renewal of the struggle for freedom.

We need to look at the whole picture of this drug war, not just a fragment or a piece of it. Most writers on this subject either get lost in the details or cannot see past the lie that the US is a “democracy.” In either case they often fail to see the realities of this history, even though the facts are clear. Presenting well-known events in chronological order clarifies the inner connection among these events and brings out their larger significance.

Indeed, placing the history in sequence makes it plain: the Great Migration brought on a Great Rebellion. A vindictive Great Repression was orchestrated to crush the Great Rebellion and prevent its continuation. Masked as the so-called “War on Drugs,” which has swept millions into prisons and jails across the US, the Great Repression has, in effect, punished generations for the “sins” of their ancestors — those who dared to rebel.

This repression is still underway today. Its effects are clearly racial. But, camouflaged as a “War on Drugs,” it has allowed the country’s rulers to appear “colorblind” or race-neutral — as if they are merely enforcing the law.

The Great Migration

In the early 20th century, fleeing the decaying Jim Crow system of agricultural labor in the fields and farms of the South, millions of African Americans moved out, seeking jobs in the military-industrial centers of the North, the Midwest and the West. From World War I to the 1960s, millions migrated from virtual chattel slavery in the South to wage slavery in the North. They found little improvement.

Herded into old ghettos, or into quickly-created new ones, they found discrimination, barely habitable housing with a constant threat of dislocation by projects of urban renewal, or “Negro removal.” Giant housing projects, little more than stacks of shacks, were built to house the many migrants. Overcrowded and neglected schools provided poor or non-existent education for their children.

The misery was compounded by relentless police abuse. When Malcolm X spoke of “the so-called Negro out here catching hell,” he was talking about (and to) this group. Malcolm lived this experience and became the spokesman of urban ghetto dwellers. The desperation and outrage experienced by these migrants made explosion inevitable.

The Great Rebellion

Violent repression of civil rights demonstrators seeking basic respect combined with the migrants’ sufferings to ignite a series of mass urban uprisings across the US. These insurrections are generally seen as individual explosions, city by city, but to grasp their cumulative significance we need to see them as a single process: African Americans striving for freedom in racist America. The rebellion was at the heart of the ’60s and drives American politics to this day, even under the nation’s first black president.

These rebellions are generally dismissed as “riots” and their significance erased.

Kenneth Stahl titled his website and book on the Detroit Rebellion of 1967 The Great Rebellion, but I expand the use of this term to include all these uprisings. Virtually all were precipitated by violent police attacks or rumors of such attacks. Since officials often lie, it is impossible to know what exactly happened in every case, but at any rate a large number of uprisings took place across the country: over 300 cities rose up in the ‘60s, according to the best estimates.

The first insurrection, in New York City, was touched off by a police murder. The initial focus of the demonstration, called for by the Congress of Racial Equality (CORE), was the disappearance of three civil rights workers in Mississippi. However, when in the early morning of July 16, off-duty police Lieutenant Thomas Gilligan killed 15-year-old African American student James Powell, CORE decided to change the focus of their protest to police brutality in Harlem.

The protest was peaceful, but rage at the murder grew into a mass confrontation with police. Bands of looters operated in Harlem’s streets at night. Upheaval soon spread to Bedford Stuyvesant. After the New York City insurrection abated, like a series of aftershocks, smaller uprisings took place throughout the area, in upstate NY, NJ and Pennsylvania.

A year later, on August 11, unrest broke out in Watts, LA. Among the first targets of looters were gun stores — and they made full use of their weapons. For almost a week, people fought the police and army to a standstill. Black and white looters working together led King to state that “this was not a race riot. It was a class riot.” The Situationist International even treated the rebellion as a “revolutionary event,” with looting seen as a rejection of the commodity system, “the first step of a vast, all-embracing struggle.”

In 1966, there were 43 civil disturbances of varying intensity across the nation, including a notable uprising in Chicago, where the Puerto Rican community exploded into a week-long rebellion after a police shooting. On April 4, 1967, King delivered what is probably his most important speech: Beyond Vietnam: A Time to Break Silence. The relevance of this speech is often downplayed, and if mentioned at all, it tends to be portrayed as King’s speech opposing the US war in Vietnam. It was much more.

In the address, King embraced the world revolution saying, “if we are to get on to the right side of the world revolution, we as a nation must undergo a radical revolution of values. We must rapidly begin the shift from a thing-oriented society to a person-oriented society.” He called the US government “the greatest purveyor of violence in the world today” and called for an end to “the giant triplets of racism, materialism and economic exploitation.”

The speech galvanized the anti-war movement. Just eleven days later, on April 15, 1967, over 400,000 people marched to the UN to demand an end to the war. It was the first demonstration I ever attended. I vividly remember the excitement in the gathering place, Central Park’s Sheep Meadow, still packed with marchers, when word came that the front of the march, which filled the streets the whole way, had reached the UN over a mile away. The movement’s power continued to grow as the spirit of revolution spread.

In just a few years, the US military began to disintegrate. Eighty percent of soldiers were taking drugs. Combat refusals, naval mutinies and fragging incidents — soldiers shooting their officers — became widespread.

In 1967, over a hundred instances of violent upheaval were recorded. Most notable were the uprising in Newark, where, after decades of housing discrimination and massive black unemployment, the violence was sparked by rumors that a black cab driver had been killed by police, and the one in the Motor City, Detroit, where 43 people were killed after 12,000 soldiers descended upon the city in an attempt to quell the protests.

The Great Repression

The year ’68 proved to be the watershed. The Rebellion reached its peak and the initiative was seized by the forces of order, who subsequently organized the Great Repression. On April 4, 1968, Martin Luther King was killed, probably by government assassination. His murder, one year to the day after his revolutionary speech, strikes some as a signal sent by the government to deter people from taking the revolutionary path. If this is so, it did not work. Following King’s murder, the largest insurrection occurred: over 100 cities exploded.

The Holy Week Uprising was the most serious bout of social upheaval in the United States since the Civil War. The largest insurrections took place in Washington, D.C., Baltimore, Louisville, Kansas City, and Chicago — with Baltimore experiencing the most significant political events. The liberal Republican governor of Maryland, Spiro T. Agnew, gathered African American community leaders and subjected them to a dressing down for not supporting the US government strongly enough. Seeking to divide and conquer, he said: “I call upon you to publicly repudiate, condemn and reject all black racists. This, so far, you have not been willing to do.”

Agnew’s speech received national headlines and earned him a place in the presidential election later that year, a contest that centered on the urban uprisings of the preceding decade and created the miserable legacy of today. US politicians refined a coded language to conceal their racial motives. The Republican candidate Richard Nixon ran against the liberal Democrat Hubert Humphrey. The civil rights movement had driven not only the KKK but also overtly racist language underground; it ended neither.

The election centered on Nixon’s call for “law and order,” a slogan that meant a tough response to insurgents (called “rioters”) and the still popular notion that politicians should be “tough on crime.” Crime, disorder and violence became synonyms for being black.

Nixon eagerly started to work on a war on drugs before his inauguration. Early in his presidency, he outlined his basic strategy to his chief of staff: “[President Nixon] emphasized that you have to face the fact that the whole problem is really the blacks. The key is to devise a system that recognizes this while not appearing to.”

Nixon’s diabolical efforts to develop a War on Drugs along these lines involved the highest officials in the US government, including William Rehnquist, later appointed Chief Justice of the Supreme Court by Reagan. Nixon initiated a war on crime as well as the War on Drugs, setting the pattern for future presidents.

Following in his predecessors’ footsteps, Reagan outdid Nixon in his get-tough-on-crime policies and oversaw the steepest rise in incarceration rates. Bill Clinton signed an omnibus crime bill into law in 1994, expanding the list of capital offenses and creating the federal “three strikes” provision mandating life sentences for criminals convicted of a violent felony after two or more prior convictions, including drug crimes. He poured over $30 billion into militarizing the nation’s police. His group, the Democratic Leadership Council, brought much of the Democratic Party to embrace coded racial politics in order to win over white voters.

For a new beginning

As a movement to stop violent police repression grows across the nation, some of our current rulers seem to understand that they have a tiger by the tail. The Clinton team has begun to suggest that mass incarceration might end. Clinton herself, as part of her presidential campaign, called “for a re-evaluation of prison sentences and trust between police and communities.”

The Black Lives Matter movement recognizes that discontent fueled by mass incarceration contributes to the movement to stop police murders. Less well-recognized is that granting the police immunity is itself part of the generalized repression of African Americans. The system of mass incarceration rests on a high degree of police discretion in choosing whom to suspect, interrogate and arrest, and in how to do these things. Restricting the police can hardly be allowed if the police are to continue the overall project of racial repression.

Part of developing a new revolutionary movement is to reclaim our history. The masters keep us enslaved by blinding us to our collective strength. The story of the ‘60s uprisings is one rich in power and agency; this is the reason why the rulers want to erase this period from the collective memory altogether.

At the same time, we must also recognize that the uprisings of the ’60s failed. Despite the vast strength revealed in the Great Rebellion, our enemies were able to use the images of violence and looting to further the divisions in US society and to institute their vengeful repression with at least the passive consent of the “white” majority. Time and again, the mainstream media proved a powerful tool in promoting the image of black and brown people as violent, criminal and dangerous.

It must be acknowledged that widespread looting and violence frightened the “white” majority, making it easier for the rulers to split the people and institute the Great Repression. King’s revolutionary non-violence had a much different effect on the American people. This must be pondered by serious revolutionaries.

Conditions for a new revolutionary movement are gradually maturing. There are growing rebellions seeking a new way of life throughout the world. In the US, an ever-spreading movement affirms the value of black lives as increasing numbers of European-American youth take up the struggle of African Americans as their own. Such a movement may, in time, bring an end to the socially constructed notion of whiteness, eliminating a key pillar of the rulers’ domination.

In the Virginia colony in the 17th century, the masters were horrified to see African and European laborers combine to seek to destroy the system of enslavement. Their response was to create a sharp division in condition between their African and their European slaves. They “invented” the white race to split the laborers and preserve their power — a remarkably effective and durable approach.

Race is a social construct devised and manipulated by our masters to maintain their rule. Only by eliminating class society, which continues to depend on racism, can racism as such be swept away.

Paul Bermanszohn, son of Holocaust survivors, is a retired psychiatrist and lifelong political revolutionary. He was shot in the head in an assassination attempt in the 1979 Greensboro Massacre, in which five of his close comrades were killed. His web site is Survival and Transformation.

This article is an edited version of a talk presented at a meeting of the End the New Jim Crow Action Network, on 14 July 2015 (Bastille Day), Kingston, NY.

 

http://roarmag.org/2015/08/us-war-on-drugs-black-repression/

Amy, a documentary film about the British singer Amy Winehouse

By Joanne Laurier
12 August 2015

Directed by Asif Kapadia

British-born director Asif Kapadia’s documentary, Amy, about the pop singer Amy Winehouse (1983-2011), is a straightforward and compelling account of the performer’s life starting at the age of fourteen. Through video footage from a variety of devices and the voiceover comments of friends, family members and music industry figures (Kapadia conducted 100 interviews), the documentary paints a picture of an immensely talented and tortured musical prodigy.

During her eight-year recording career, beginning when she was still a teenager, Winehouse garnered numerous awards, including six Grammys. Her second album, Back to Black, released in October 2006, made her an international singing star. By the time of her death, she had sold more than six million albums in the UK and US alone. Kapadia’s film features a number of her biggest hits, “Rehab” (2006), “You Know I’m No Good” (2007), “Back to Black” (2007), “Love is a Losing Game” (2007), and her soulful duet with Tony Bennett, “Body and Soul” (2011).


From an early age, as the documentary reveals, Winehouse aspired above all to be a jazz singer. Among her most important influences were Dinah Washington, Sarah Vaughan, Ella Fitzgerald, Billie Holiday, Frank Sinatra, Bennett and others. But she also channeled many of the pop artists and trends of the 1960s and 1970s, including Motown, R&B, reggae, Carole King, James Taylor and “girl groups” like the Shirelles and the Ronettes. In her music and extraordinary voice one encounters a multitude of influences, each one distinct and yet blended together to create a personal and unique sound. A record industry figure notes that she was a “very old soul in a very young body.”

After Winehouse’s death, Bennett commented: “It was such a sad thing because … she was the only singer that really sang what I call the ‘right way,’ because she was a great jazz-pop singer. … A true jazz singer.”

The movie opens with footage of a close friend’s 14th birthday party in 1998, at which Winehouse offers an alluring, mischievous version of “Happy Birthday” à la Marilyn Monroe, and ends with the aftermath of her tragic death from alcohol poisoning in July 2011 at the age of 27.

A friend observes at one point that she was “a North London Jewish girl with a lot of attitude.” Her father Mitchell owned a cab and her mother Janis was a pharmacist. Her paternal grandmother Cynthia was a singer and at one time dated Ronnie Scott, the tenor saxophonist and owner of the best-known jazz club in London.

Kapadia’s Amy follows Winehouse from her teenage years to the beginnings of her professional music career in 2002 and beyond. We see a host of appearances and performances, both private and public, some of them intensely intimate and very affecting to the viewer. In some of these scenes, the young singer is disarmingly genuine, childlike and really adorable.

Three of Winehouse’s friends, including two from childhood, Juliette Ashby and Lauren Gilbert, and her first manager (when he was 19 and Winehouse was 16), Nick Shymansky, provide the most in-depth and believable portrait. Her father, her ex-husband Blake Fielder and her promoter-manager Raye Cosbert also feature prominently in the film, in a less favorable light.

At a certain point, of course, Amy gets down to business, which every viewer knows is coming—the singer’s [meteoric] Rise and [tragic] Fall, as it were. Much of the documentary details her successes and the severe complications or contradictions accompanying those successes.


Winehouse insists—and one feels, sincerely—on several occasions that celebrity and what comes with it is not her goal. As she told CNN in a 2007 interview, “I don’t write songs because I want my voice to be heard or I want to be famous or any of that stuff. I write songs about things I have problems with and I have to get past them and I have to make something good out of something bad.” Early on in the film, in fact, she asserts, “I’d probably go mad [if I were famous].” Later on: “If I really thought I was famous, I’d f—g go top [kill] myself.”

Tragically, Winehouse, already a bulimic since her adolescence and a heavy drinker early on in life, falls into heavy drug use. Kapadia’s documentary focuses perhaps too much on this aspect, as though this by itself could explain her fate.

The film effectively captures some of the ghastliness of the modern celebrity racket. Countless scenes record paparazzi camped outside her door and snapping photos of her every move, including the most crazed and desperate. Her ex-manager Shymansky told an interviewer, “the paparazzi were allowed to get brutally close…it’s this infatuation with getting up people’s skirts, or seeing someone vomit, or punching a paparazzi.”

As Winehouse goes to pieces in public, the media engages in what Shymansky calls, in the film, “a feeding frenzy.” Kapadia himself told the media, “This is a girl who had a mental illness, yet every comedian, every TV host, they all did it [bullied or laughed at her] with such ease, without even thinking. We all got carried away with it.”

The production notes for Amy suggest: “The combination of her raw honesty and supreme talent resulted in some of the most original and adored songs of the modern era.

“Her huge success, however, resulted in relentless and invasive media attention which coupled with Amy’s troubled relationships and precarious lifestyle saw her life tragically begin to unravel.”

The inquest into Winehouse’s death, according to the Daily Mail, found that she “drank herself to death … Three empty vodka bottles were found near her body in her bedroom. A pathologist who examined her said she had 416mg [milligrams] of alcohol per decilitre [3.38 fluid ounces] of blood—five times the legal drink-drive limit of 80mg. The inquest heard that 350mg was usually considered a fatal amount.”

Kapadia’s documentary is both valuable and intriguing. Because the director lets Winehouse speak (and sing) for herself, the viewer receives a relatively clear-eyed and balanced picture of both her artistry and her qualities as a human being. Amy rightfully points a finger at a predatory industry. Kapadia told NME (New Musical Express, the British music journalism magazine and website), “I was angry, and I wanted the audience to be angry. … This started off as a film about Amy, but it became a film about how our generation lives.” NME continues, “Kapadia hopes his film will force the music industry to re-examine its handling of young, troubled talents.”

In the interview, unfortunately, the director places too much of the blame on the public itself, as though people were in control of the information they received and were responsible for the operations of the entertainment industry. Amy, at more than two hours, is perhaps overly long because the filmmaker seems intent on driving home to the viewer his or her supposed “complicity.”

In opposition to this, the 2011 WSWS obituary of Winehouse argued that the ultimate responsibility for her death lay with “the intense … pressures generated by the publicity-mad, profit-hungry music business, which chews up its human material almost as consistently as it spits out new ‘product.’”

Comic Russell Brand, in a comment on the death of Winehouse, a close friend, characterized the celebrity culture as “a vampiric, cannibalizing system that wants its heroes and heroines dead so it can devour their corpses in public for entertainment.”

Stepping back, Amy Winehouse was definitely a cultural phenomenon. As opposed to many acts and performers, who ride on the crest of massive marketing campaigns, like bars of soap or automobiles, she came by her fame honestly, almost in spite of her efforts. She truly struck a chord with audiences and listeners.

This was not an accident. Her songs, in part because they brought to bear (and made new) so much popular musical history, registered with audiences as more substantial, truthful and urgent than the majority of current fare. Winehouse’s popularity reflected a dissatisfaction with the lazy, self-absorbed pablum that dominates the charts.

In terms of the combination of factors that led to Winehouse’s death, of course there were the individual circumstances of her background and life. The entertainment industry juggernaut inflicts itself on everyone, but only the most vulnerable collapse under what is to them an unbearable burden.

As always in such cases, the media self-servingly treated the singer’s death as a purely personal episode. The Daily Mirror, for example, fatuously suggested that Winehouse was “a talent dogged by self destruction.”

Surely, something more than this, or what Amy offers as an explanation (drugs, difficult family history, a bad marriage, a cold-blooded industry), for that matter, is called for. Why would someone at the height of her global fame and popularity bring about the end of her life so abruptly and “needlessly”? What made her so wretched and conflicted?

To begin to get at an answer, one must look at the more general circumstances of her life, including the character of the period in which she lived…and died.

She came of age and later gained public attention between the years 2001 and 2011, in other words, a decade dominated by “the war on terror” and the politicians’ “big lie” and hostility to democratic rights (the Blair government in particular prided itself on flouting the public will), as well as by global economic turbulence and sharp social polarization. The generation she belonged to increasingly looked to the future with skepticism and even alarm. A serious darkness descended into more than one soul.

One study of American college-age students in the first decade of the 21st century, and this could certainly be applied to British young people as well, sums them up as a “generation on a tightrope,” facing an “abyss that threatens to dissolve and swallow them,” “seeking security but [living] in an age of profound and unceasing change.”

Amy Winehouse, as far as this reviewer is aware (or the film would indicate), never addressed a single broader social issue, including the Blair government’s involvement in the Iraq War, which provoked one of the largest protest demonstrations in British history in February 2003. Nonetheless, the drama, anxiety and sensitivity of her music speaks to something about both the turbulent character of the decade and the dilemma of a generation whose dreams and idealism came up against the brutal realities of the existing social set-up.

The manner in which Winehouse approached the latter “dilemma” was refracted through the objectively shaped confusion and difficulties of that same generation, which finds itself hostile toward dominant institutions and yet not entirely clear why. In that light, Winehouse’s famous refrain in “Rehab,” “They tried to make me go to rehab, but I said, ‘No, no, no,’” which from one point of view might simply seem self-indulgent, comes across as a firm (if misguided in this case) rejection of intrusive orders from above.

In part due to an unfavorable and unsympathetic social atmosphere, a conscious or unconscious alienation from official public life, Winehouse turned inward and reduced these significant feelings into purely personal passions and self-directed anger, and ultimately, with her lowered psychic immune system, found herself a victim of that rage and disorientation.

Kapadia’s Amy does not go anywhere near some of these critical issues, but it is a worthwhile introduction to the work of a remarkably gifted artist.

 

 

http://www.wsws.org/en/articles/2015/08/12/amyw-a12.html

Fifty years on: Medicare under assault


31 July 2015

President Lyndon B. Johnson signed Medicare, the national health insurance program for Americans 65 years of age and older, into law on July 30, 1965. Medicare and the accompanying Medicaid health program for the poor were the last major social reforms enacted in the US and came at a time of intense crisis for American capitalism.

The mid-1960s saw a nation gripped by the civil rights movement and militant struggles by workers for higher wages and improved social conditions. Two weeks before Johnson signed the Medicare bill, a riot broke out in Harlem, New York following the shooting of a black teenager, one of the earliest of the numerous urban rebellions that would erupt over the next three years.

In the US pursuit of global domination, on March 8, 1965, 3,500 US Marines were dispatched to South Vietnam, marking the beginning of the US ground war in Southeast Asia. Only two days before signing Medicare into law, Johnson announced the doubling of draft quotas and the dispatch of another 50,000 troops to Vietnam. The war would end in a humiliating defeat for US imperialism a decade later, after the deaths of more than 58,000 Americans and millions of Vietnamese.

As with the Social Security Act under Franklin D. Roosevelt in 1935 and the establishment of industrial unions, Medicare was not granted out of the kindness of the hearts of the ruling class. It came as a concession to mass struggles carried out by the working class.

However, by today’s standards, passages from the Democratic Party platform on which Johnson ran in 1964 sound radical. In a section titled “The Individual,” the platform reads: “The health of the people is important to the strength and purpose of our country and is a proper part of our common concern. In a nation that lacks neither compassion nor resources, the needless suffering of people who cannot afford adequate medical care is intolerable.”

From the start, Medicare fell far short of providing free and comprehensive medical care for all seniors. As originally enacted, the program provided for inpatient hospital care (Part A) as well as certain outpatient services (Part B), including preventive services, ambulance transport, mental health and other medical services. Part B has always required a premium payment.

In 1972, President Richard Nixon signed legislation expanding coverage for those under age 65 with long-term disabilities and end-stage renal disease. Since 1997, enrollees have had the option to enroll in Medicare Advantage (Part C), managed care programs administered by private companies. It was not until 2003 that optional prescription drug benefits (Part D), exclusively provided through private plans, were added under George W. Bush.

It is important to note that all components of Medicare, except for Part A in certain instances, carry premiums and deductibles. Despite these shortcomings, Medicare represented an important, albeit limited, advance in health care for seniors that was denounced as “socialism” in many ruling class circles.

The Medicare legislation faced significant opposition in both big business parties. The Democratic vote in favor of the bill was 57-7 in the Senate and 237-48 in the House. The Republicans opposed the bill 13-17 in the Senate and narrowly approved it in the House, 70-68.

Hostility to the legislation among leading Republicans was vociferous. Senator Barry Goldwater commented in 1964: “Having given our pensioners their medical care in kind, why not food baskets, why not public housing accommodations, why not vacation resorts, why not a ration of cigarettes for those who smoke and of beer for those who drink?”

In 1964, future president George H.W. Bush denounced the impending Medicare bill as “socialized medicine.” While it was nothing of the sort, it was seen by many supporters as a first step toward the establishment of universal health care.

Despite its limitations, it is indisputable that the program has had an immense impact on the health and social wellbeing of the elderly population.

Largely as a result of Medicare and improved medical technologies, life expectancy at age 60 increased from 14.3 years in 1960 to 19.3 years in 2012. Prior to Medicare, about half of America’s seniors did not have hospital insurance, more than one in four elderly went without medical care due to cost, and one in three seniors lived in poverty.

Some 53 million elderly are currently enrolled in Medicare. Today, virtually all seniors have access to health care and only about 14 percent live below the poverty line. Despite a relentless attack on Medicare services in recent years, Medicare is extremely popular—with 77 percent of Americans viewing it as a “very important” program that needs to be defended, according to a recent poll.

The program has been under assault from sections of the political establishment and corporate America since its inception. In 1995, under the leadership of then-House Speaker Newt Gingrich, Republicans proposed cutting 14 percent from projected Medicare spending and forcing millions of elderly recipients into managed health programs. The aim, in Gingrich’s words, was to ensure that Medicare was “going to wither on the vine.”

In the most open threat to privatize Medicare, in the spring of 2014, Rep. Paul Ryan, Republican of Wisconsin, released a “Path to Prosperity” budget plan that proposed slashing $5.1 trillion in spending over 10 years. Key to his blueprint was the institution of “premium support” in health care for seniors, essentially a voucher plan under which seniors could purchase either private insurance or Medicare coverage.

Fast-forward to the current presidential campaign. Republican candidate Jeb Bush, speaking at an event last week in New Hampshire sponsored by the billionaire Koch brothers, said of Medicare: “We need to figure out a way to phase out this program … and move to a new system that allows them [those over 65] to have something—because they’re not going to have anything.”

Bush and others justify their proposals to privatize or outright abolish Medicare with claims that the program will be bankrupt in the near future. But a recent report shows that projected Medicare spending will account for 6 percent of Gross Domestic Product by 2090, down from earlier projections that it would make up 13 percent of GDP in 2080.

This is hardly an unreasonable amount to spend on the health of the nation’s elderly population. This spending is also not a gift from the government, but is funded through deductions from the paychecks of workers all their working lives. However, the policy decisions of politicians in Washington are not driven by preserving the health and welfare of America’s older citizens, but by the defense of the capitalist profit system.

While President Obama and the Democrats seek to distance themselves from proposals to privatize Medicare, Ryan and Bush only openly express what many Democrats are thinking. The Obama administration, with the Affordable Care Act (ACA) leading the charge, is working to gut Medicare and transform it into a poverty program with barebones coverage for the majority of working class and middle class seniors.

In 2013, the Congressional Budget Office estimated that the ACA would reduce Medicare spending by $716 billion from 2013 to 2022. Under the first four years of the ACA, home health care under Medicare is being cut by 14 percent, including $60 million in 2015 and $350 million in 2016. While doing nothing to rein in the outrageous charges by pharmaceutical companies for cancer and other life-saving drugs, the Obama administration’s proposed 2016 budget includes $126 billion in cuts from what Medicare will pay for these drugs.

In what constitutes a historic attack on the program, Obama hailed as a “bipartisan achievement” passage of a bill in April that expands means testing for Medicare and establishes a new payment system in which doctors will be rewarded for cutting costs, while being punished for the volume and frequency of the health care services they provide.

It is telling that an article in the right-wing National Review, headlined “A Medicare Bill Conservatives Need to Embrace,” hailed the legislation and said the effects of its structural reforms would be “permanent and cumulative.”

The bipartisan backing for the Medicare bill is based on common agreement that Medicare spending must be slashed and a radical shift instituted away from the “lavish” fee-for-service system, in which supposed “unnecessary” tests and procedures are performed on Medicare patients, needlessly treating disease and extending their lives.

The president has claimed that the enactment of the program commonly known as Obamacare is the most sweeping social reform since Medicare was signed into law. This is a cynical lie. The ACA is, in fact, a social counter-reform that was aimed from the start at cutting costs for the government and corporations and reducing and rationing health care for the majority of Americans.

The ACA is designed to encourage employers to slash or end their employee insurance plans, forcing workers to individually purchase plans from private companies on government-run exchanges. The result will be the dismantling of the employer-provided health insurance system that has existed since the early 1950s, a vast increase in workers’ out-of-pocket costs, and a decrease in the care they receive.

Medicare, one of the last vestiges of social reform from a previous era, along with Social Security, is being undermined. The social right to health care—along with the right to a livable income, education, housing, and a secure retirement—is incompatible with a society subordinated to capitalist forces.

True reform of the health care system requires that it be reorganized based on a socialist program that proceeds from the fulfillment of human needs, not the enrichment of a parasitic elite.

Kate Randall

 

http://www.wsws.org/en/articles/2015/07/31/pers-j31.html

US health insurers seek huge rate increases for 2016


By Kate Randall
6 July 2015

Health insurance companies across the US are seeking rate increases of 20 percent to 40 percent and more, according to filings by the insurers with state insurance commissions. Insurance companies cite a larger than expected pool of unhealthy enrollees, high drug prices, and diminishing profits as contributing factors requiring the premium hikes.

The rate increase requests are the latest demonstration of the pro-corporate character of the Affordable Care Act (ACA), commonly known as Obamacare. The news follows the US Supreme Court’s 5-4 ruling June 25, which upheld government tax subsidies, a critical component of the law that provides tax credits to those purchasing insurance coverage on all the exchanges set up under the ACA.

Under Obamacare’s “individual mandate,” uninsured individuals and families must obtain insurance or face a tax penalty. The premiums for plans purchased on the ACA exchanges go directly into the coffers of the private insurers.

Blue Cross and Blue Shield, one of the nation’s largest insurers, is seeking double-digit increases in many states, including a 23 percent hike in Illinois, 25 percent in North Carolina, 31 percent in Oklahoma, 36 percent in Tennessee, and 54 percent in Minnesota.

The ACA, signed into law in 2010, requires insurance companies to disclose large proposed increases in premiums, and increases of 10 percent or more must be made public and are subject to review under federal law. However, there is no mechanism to rein in the rate hikes if state insurance commissions approve them.

In cynical comments made last week in an appearance in Tennessee, Barack Obama said consumers should put pressure on state insurance regulators to examine the rate requests carefully. “My expectation,” he said, “is that they’ll come in significantly lower than what’s being requested.”

In one example of the opposite result, Oregon Insurance Commissioner Laura N. Cali has just approved rate increases for companies that cover more than 220,000 people. Moda Health Plan, with the state’s largest enrollment, received a 25 percent increase; LifeWise, the second-largest Oregon plan, received a 33 percent hike.

In some cases, state insurance commissions have already granted insurance hikes in excess of those requested by the insurers. In Oregon, Health Net requested increases averaging 9 percent and was granted increases averaging 34.8 percent. Another insurer in the state, Health Co-op, requested a 5.3 percent increase and the state approved a 19.9 percent hike.

Insurers cite the fact that new customers enrolling in ACA plans have turned out to be sicker than expected, which leaves the insurers with a more unhealthy pool of insured customers, requiring them to increase premiums. This is hardly a shocking development, as significant numbers of younger, healthier people have chosen to remain uninsured and risk the cost of getting sick and needing medical care that would have to be paid out-of-pocket.

However, these young people are not simply playing Russian roulette with their health. The driving force behind the decision of many not to enroll in coverage—including the so-called young invincibles, and many workers—is the economic reality that they cannot afford the premiums. To add insult to injury, they face the prospect of being both uninsured and paying tax penalties under the Affordable Care Act. For individuals, these penalties for the uninsured were $95 in 2014, rose to $325 in 2015, and will increase to $695 in 2016.

Another factor contributing to the increased pool of unhealthy customers is a policy adopted by the Obama administration in late 2013 that allowed people to keep insurance plans that did not meet new federal standards, which mandate a set of required benefits, including wellness checks and some screenings. The exemption was a political move on the president’s part to make good on his earlier statements that “If you like your plan, you can keep it.”

Customers may also be required to switch plans in order to keep premium costs down. The Kaiser Family Foundation analyzed 2016 premium changes in 10 states and the District of Columbia where the group found complete data for all insurers for the lowest- and second-lowest cost “silver” plans.

For example, Kaiser found that in Seattle, Washington, an unsubsidized person enrolled in the second-lowest silver plan offered by BridgeSpan in 2015 would see a 12.6 percent premium increase if the individual stayed in the same plan, but would pay 10.1 percent less if the person switched to a similar plan offered by Ambetter.

Kaiser found that those switching plans to save money would potentially have to switch doctors and hospitals as well, and that staying with one’s plan also did not guarantee maintaining a provider network. The entire framework of the ACA is thus skewed to the whims of the insurers, and customers are required to scramble each year to determine their selection.

Health and Human Services (HHS) Secretary Sylvia Burwell told the Times, “You have a marketplace where there is competition and people can shop for the plan that best meets their needs in terms of quality and price.”

The HHS secretary did not mention that the most affordable “bronze” plans come with deductibles in excess of $5,000, which means that coverage for all but “essential” medical services does not kick in until the deductible is paid out-of-pocket. This means that despite being insured, many people will go without health care for themselves or their children, resulting in needless suffering and deaths.

Some of the premium increase requests by the private insurers are staggering. Blue Cross and Blue Shield of New Mexico has requested rate hikes averaging 51 percent for its 33,000 enrollees. Scott & White Health Plan in Texas is seeking a 32 percent rate increase. In a ludicrous statement to the New York Times, Scott & White’s CEO Marinan R. Williams said that the rate hike requests show that “there was a real need for the Affordable Care Act.”

Arches Health Plan, which covers about a quarter of the people insured through Obamacare in Utah, says it collected premiums of $39.7 million in 2014, but had claims of $56.3 million in 2014. The insurer has requested rate increases averaging 45 percent for 2016.

The Obama administration has touted a provision of Obamacare that requires insurers to spend at least 80 percent of premiums on medical care and related activities. How this works out in reality, however, is that if the profit margins are not to the insurers’ liking, they request premium increases to generate the revenue to pay stockholder dividends and the bloated salaries of insurance company executives.

The CEOs of the top five for-profit health insurance companies—Aetna, Anthem, Cigna, Humana and UnitedHealth—all took home at least $10 million in 2014, according to filings with the Securities and Exchange Commission. Executive compensation ranged from $10.1 million for Humana CEO Bruce D. Broussard to more than $15 million for Aetna CEO Mark Bertolini.

In the latest round of mergers spurred by Obamacare, Aetna Inc. and Humana Inc. announced last week that they had reached a deal to merge, creating an insurance company valued at $37 billion. If approved by the government and carried through, the combined insurer would become the nation’s second largest, covering more than 10 percent of the US population.

 

http://www.wsws.org/en/articles/2015/07/06/prem-j06.html

It Is Expensive to Be Poor

Minimum-wage jobs are physically demanding, have unpredictable schedules, and pay so meagerly that workers can’t save up enough to move on.


Binita Pradham is a single mother who runs a food business and raises her 4-year-old son. (Barbara Reis)

Fifty years ago, President Lyndon B. Johnson made a move that was unprecedented at the time and remains unmatched by succeeding administrations. He announced a War on Poverty, saying that its “chief weapons” would be “better schools, and better health, and better homes, and better training, and better job opportunities.”

So starting in 1964 and for almost a decade, the federal government poured at least some of its resources in the direction they should have been going all along: toward those who were most in need. Longstanding programs like Head Start, Legal Services, and the Job Corps were created. Medicaid was established. Poverty among seniors was significantly reduced by improvements in Social Security.

Johnson seemed to have established the principle that it is the responsibility of government to intervene on behalf of the disadvantaged and deprived. But there was never enough money for the fight against poverty, and Johnson found himself increasingly distracted by another and deadlier war—the one in Vietnam. Although underfunded, the War on Poverty still managed to provoke an intense backlash from conservative intellectuals and politicians.

In their view, government programs could do nothing to help the poor because poverty arises from the twisted psychology of the poor themselves. By the Reagan era, it had become a cornerstone of conservative ideology that poverty is caused not by low wages or a lack of jobs and education, but by the bad attitudes and faulty lifestyles of the poor.

Picking up on this theory, pundits and politicians have bemoaned the character failings and bad habits of the poor for at least the past 50 years. In their view, the poor are shiftless, irresponsible, and prone to addiction. They have too many children and fail to get married. So if they suffer from grievous material deprivation, if they run out of money between paychecks, if they do not always have food on their tables—then they have no one to blame but themselves.

In the 1990s, with a bipartisan attack on welfare, this kind of prejudice against the poor took a drastically misogynistic turn. Poor single mothers were identified as a key link in what was called “the cycle of poverty.” By staying at home and collecting welfare, they set a toxic example for their children, who—important policymakers came to believe—would be better off being cared for by paid child care workers or even, as Newt Gingrich proposed, in orphanages.

Welfare “reform” was the answer, and it was intended not only to end financial support for imperiled families, but also to cure the self-induced “culture of poverty” that was supposedly at the root of their misery. The original welfare reform bill—a bill, it should be recalled, which was signed by President Bill Clinton—included an allocation of $100 million for “chastity training” for low-income women.

The Great Recession should have put the victim-blaming theory of poverty to rest. In the space of only a few months, millions of people entered the ranks of the officially poor—not only laid-off blue-collar workers, but also downsized tech workers, managers, lawyers, and other once-comfortable professionals. No one could accuse these “nouveau poor” Americans of having made bad choices or bad lifestyle decisions. They were educated, hardworking, and ambitious, and now they were also poor—applying for food stamps, showing up in shelters, lining up for entry-level jobs in retail. This would have been the moment for the pundits to finally admit the truth: Poverty is not a character failing or a lack of motivation. Poverty is a shortage of money.

For most women in poverty, in both good times and bad, the shortage of money arises largely from inadequate wages. When I worked on my book, Nickel and Dimed: On (Not) Getting By in America, I took jobs as a waitress, nursing-home aide, hotel housekeeper, Wal-Mart associate, and a maid with a house-cleaning service. I did not choose these jobs because they were low-paying. I chose them because these are the entry-level jobs most readily available to women.

What I discovered is that in many ways, these jobs are a trap: They pay so little that you cannot accumulate even a couple of hundred dollars to help you make the transition to a better-paying job. They often give you no control over your work schedule, making it impossible to arrange for child care or take a second job. And in many of these jobs, even young women soon begin to experience the physical deterioration—especially knee and back problems—that can bring a painful end to their work life.

I was also dismayed to find that in some ways, it is actually more expensive to be poor than not poor. If you can’t afford the first month’s rent and security deposit you need in order to rent an apartment, you may get stuck in an overpriced residential motel. If you don’t have a kitchen or even a refrigerator and microwave, you will find yourself falling back on convenience store food, which—in addition to its nutritional deficits—is also alarmingly overpriced. If you need a loan, as most poor people eventually do, you will end up paying an interest rate many times more than what a more affluent borrower would be charged. To be poor—especially with children to support and care for—is a perpetual high-wire act.

Most private-sector employers offer no sick days, and many will fire a person who misses a day of work, even to stay home with a sick child. A nonfunctioning car can also mean lost pay and sudden expenses. A broken headlight invites a ticket, plus a fine greater than the cost of a new headlight, and possible court costs. If a creditor decides to get nasty, a court summons may be issued, often leading to an arrest warrant. No amount of training in financial literacy can prepare someone for such exigencies—or make up for an income that is impossibly low to start with. Instead of treating low-wage mothers as the struggling heroines they are, our political culture still tends to view them as miscreants and contributors to the “cycle of poverty.”

If anything, the criminalization of poverty has accelerated since the recession, with growing numbers of states drug testing applicants for temporary assistance, imposing steep fines for school truancy, and imprisoning people for debt. Such measures constitute a cruel inversion of the Johnson-era principle that it is the responsibility of government to extend a helping hand to the poor. Sadly, this has become the means by which the wealthiest country in the world manages to remain complacent in the face of alarmingly high levels of poverty: by continuing to blame poverty not on the economy or inadequate social supports, but on the poor themselves.

It’s time to revive the notion of a collective national responsibility to the poorest among us, who are disproportionately women and especially women of color. Until that happens, we need to wake up to the fact that the underpaid women who clean our homes and offices, prepare and serve our meals, and care for our elderly—earning wages that do not provide enough to live on—are the true philanthropists of our society.

 

Microdosing: A New, Low-Key Way to Use Psychedelics

Some of the most surprising people are using LSD and other psychedelics in extremely low doses, and the results are most interesting.


At the fifth annual Horizons: Perspectives on Psychedelics conference in New York City in October 2011, pioneering psychedelic researcher Dr. James Fadiman solidified his reemergence as a leading researcher of and advocate for psychedelic substances. Fadiman had done groundbreaking research with LSD up until the very day it was federally banned in 1966, but after that, he retreated into a life of quiet conventionality—at least on the surface.

While Fadiman withdrew from the public eye for decades, he never gave up his interest in and enthusiasm for psychedelics. A year before appearing at Horizons, he published his life’s work, The Psychedelic Explorer’s Guide, an amazing compendium of hallucinogenic lore as well as a user’s manual for would-be psychonauts.

The book examined the primary uses for psychedelics, such as spiritual enlightenment at high doses and improvements in creativity at smaller ones. It also addressed a lesser-known but increasingly popular phenomenon: microdosing.

Microdosing refers to taking extremely small doses of psychedelics, so small that the effects usually associated with such drugs are not evident or are “sub-perceptual,” while going about one’s daily activities. It’s being done by everyone from harried professionals to extreme athletes to senior-citizen businesswomen, and they’re claiming serious benefits from it.

To trip brains (or have a transcendental experience) on LSD, a dose of 400 micrograms or more is called for; to explore your inner self, take 200 micrograms; for creative problem solving, try 100 mikes; but for microdosing, take only 10 to 15 micrograms. Similar microdoses for other psychedelics would include 0.2-0.5 grams of dried mushrooms (about one-fifth the normal dose) or about 50-75 micrograms of mescaline.

At that Horizons conference, as reported by Tim Doody in a fascinating profile of Fadiman, the bespectacled 70-year-old at one point asked his audience “How many of you have heard about microdosing?” A couple of dozen hands went up. “Whoa,” he exclaimed.

He explained that, beginning in 2010, he had been doing a study of microdosing. Since research with LSD remains banned, he couldn’t do it in a lab, but had instead relied on a network of volunteers who administered their own doses and reported back with the results. The subjects kept logs of their doses and daily routines, and sent them via email to Fadiman. The results were quite interesting, he said.

“Micro-dosing turns out to be a totally different world,” he explained. “As someone said, the rocks don’t glow, even a little bit. But what many people are reporting is, at the end of the day, they say, ‘That was a really good day.’ You know, that kind of day when things kind of work. You’re doing a task you normally couldn’t stand for two hours, but you do it for three or four. You eat properly. Maybe you do one more set of reps. Just a good day. That seems to be what we’re discovering.”

Study participants functioned normally in their work and relationships, Fadiman said, but with increased focus, emotional clarity, and creativity. One physician reported that microdosing put him “in touch with a deep place of ease and beauty.” A singer reported being better able to hear and channel music.

In his book, a user named “Madeline” offered this report: “Microdosing of 10 to 20 micrograms (of LSD) allow me to increase my focus, open my heart, and achieve breakthrough results while remaining integrated within my routine. My wit, response time, and visual and mental acuity seem greater than normal on it.”

These results are not yet peer-reviewed, but they are suggestive.

“I just got a report from someone who did this for six weeks,” Fadiman said. “And his question to me was, ‘Is there any reason to stop?’”

It isn’t just Fadiman acolytes who are singing the praises of microdosing. One 65-year-old Sonoma County, California, small businesswoman who had never heard of the man told AlterNet she microdosed because it made her feel better and more effective.

“I started doing it in 1980, when I lived in San Francisco and one of my roommates had some mushrooms in the fridge,” said the woman, who asked to remain anonymous. “I just took a tiny sliver and found that it made me alert and energized all day. I wasn’t high or anything; it was more like having a coffee buzz that lasted all day long.”

This woman gave up on microdosing when her roommate’s supply of ‘shrooms ran out, but she has taken it up again recently.

“I’m very busy these days and I’m 65, so I get tired, and maybe just a little bit surly sometimes,” she admitted. “So when a friend brought over some chocolate mushrooms, I decided to try it again. It makes my days so much better! My mood improves, my energy level is up, and I feel like my synapses are really popping. I get things done, and I don’t notice any side-effects whatsoever.”

She’s not seeking visionary experiences, just a way to get through the day, she said.

In an in-depth post on the High Existence blog, Martijn Schirp examined the phenomenon in some detail, as well as describing his own adventure in microdosing:

“On a beautiful morning in Amsterdam, I grabbed my vial of LSD, diluted down with half high grade vodka and half distilled water, and told my friend to trust me and open his mouth. While semi-carefully measuring the droplets for his microdose, I told him to whirl it around in his mouth for a few minutes before swallowing the neuro-chemical concoction. I quickly followed suit,” Schirp wrote. “We had one of the best walking conversations of our lives.”

James Oroc, author of Tryptamine Palace: 5-MeO-DMT and the Sonoran Desert Toad, described another realm where microdosing is gaining popularity. In a Multidisciplinary Association for Psychedelic Studies monograph titled “Psychedelics and Extreme Sports,” Oroc extolled the virtues of microdosing for athletes. Taking low-dose psychedelics improved “cognitive functioning, emotional balance, and physical stamina,” he wrote.

“Virtually all athletes who learn to use LSD at psycholytic [micro] dosages believe that the use of these compounds improves both their stamina and their abilities,” Oroc continued. “According to the combined reports of 40 years of use by the extreme sports underground, LSD can increase your reflex time to lightning speed, improve your balance to the point of perfection, increase your concentration until you experience ‘tunnel vision,’ and make you impervious to weakness or pain. LSD’s effects in these regards amongst the extreme-sport community are in fact legendary, universal, and without dispute.”

Even the father of LSD, Albert Hofmann, seems to have been a fan. In his book, Fadiman notes that Hofmann microdosed himself well into old age and quotes him as saying LSD “would have gone on to be used as Ritalin if it hadn’t been so harshly scheduled.”

Psychonauts, take note. Microdosing isn’t going to take you to another astral plane, but it may help you get through the day. For more information on the microdosing experience, dig into the links above, as well as the Reddit user forum on microdosing. Surprisingly enough, the vaults of Erowid, that repository of drug user experiences, returned only one entry about microdosing, from someone who appears to have been a subject in the Fadiman microdosing experiments.

And, of course, if you want to try this, you have to obtain some psychedelics. They’re illegal, which doesn’t mean they aren’t around. An increasing number of people are finding them on the dark web; others obtain them the old-fashioned way: from within their own communities. Those who are really interested will get to work.

 

Phillip Smith is editor of the AlterNet Drug Reporter and author of the Drug War Chronicle.

 

http://www.alternet.org/drugs/microdosing-new-low-key-way-use-psychedelics?akid=13216.265072.1ZXgbS&rd=1&src=newsletter1037865&t=13
