Fifty years on: Medicare under assault


31 July 2015

President Lyndon B. Johnson signed Medicare, the national health insurance program for Americans 65 years of age and older, into law on July 30, 1965. Medicare and the accompanying Medicaid health program for the poor were the last major social reforms enacted in the US and came at a time of intense crisis for American capitalism.

The mid-1960s saw a nation gripped by the civil rights movement and militant struggles by workers for higher wages and improved social conditions. A year before Johnson signed the Medicare bill, a riot broke out in Harlem, New York, following the police shooting of a black teenager, one of the earliest of the numerous urban rebellions that would erupt over the next three years.

In its pursuit of global domination, the US dispatched 3,500 Marines to South Vietnam on March 8, 1965, marking the beginning of the US ground war in Southeast Asia. Only two days before signing Medicare into law, Johnson announced the doubling of draft quotas and the dispatch of another 50,000 troops to Vietnam. The war would end in a humiliating defeat for US imperialism a decade later, after the deaths of more than 58,000 Americans and millions of Vietnamese.

As with the Social Security Act under Franklin D. Roosevelt in 1935 and the establishment of industrial unions, Medicare was not granted out of the kindness of the hearts of the ruling class. It came as a concession to mass struggles carried out by the working class.

However, by today’s standards, passages from the Democratic Party platform on which Johnson ran in 1964 sound radical. In a section titled “The Individual,” the platform reads: “The health of the people is important to the strength and purpose of our country and is a proper part of our common concern. In a nation that lacks neither compassion nor resources, the needless suffering of people who cannot afford adequate medical care is intolerable.”

From the start, Medicare fell far short of providing free and comprehensive medical care for all seniors. As originally enacted, the program provided for inpatient hospital care (Part A) as well as certain outpatient services (Part B), including preventive services, ambulance transport, mental health and other medical services. Part B has always required a premium payment.

In 1972, President Richard Nixon signed legislation expanding coverage to those under age 65 with long-term disabilities and end-stage renal disease. Since 1997, enrollees have had the option to enroll in Medicare Advantage (Part C), managed care plans administered by private companies. It was not until 2003 that optional prescription drug benefits (Part D), provided exclusively through private plans, were added under George W. Bush.

It is important to note that all components of Medicare, except for Part A in certain instances, carry premiums and deductibles. Despite these shortcomings, Medicare represented an important, albeit limited, advance in health care for seniors that was denounced as “socialism” in many ruling class circles.

The Medicare legislation faced significant opposition in both big business parties. The Democratic vote in favor of the bill was 57-7 in the Senate and 237-48 in the House. The Republicans opposed the bill 13-17 in the Senate and narrowly approved it in the House, 70-68.

Hostility to the legislation among leading Republicans was vociferous. Senator Barry Goldwater commented in 1964: “Having given our pensioners their medical care in kind, why not food baskets, why not public housing accommodations, why not vacation resorts, why not a ration of cigarettes for those who smoke and of beer for those who drink?”

In 1964, future president George H.W. Bush denounced the impending Medicare bill as “socialized medicine.” While it was nothing of the sort, it was seen by many supporters as a first step toward the establishment of universal health care.

Despite its limitations, it is indisputable that the program has had an immense impact on the health and social wellbeing of the elderly population.

Largely as a result of Medicare and improved medical technologies, life expectancy at age 60 increased from 14.3 years in 1960 to 19.3 years in 2012. Prior to Medicare, about half of America’s seniors did not have hospital insurance, more than one in four elderly went without medical care due to cost, and one in three seniors lived in poverty.

Some 53 million elderly are currently enrolled in Medicare. Today, virtually all seniors have access to health care and only about 14 percent live below the poverty line. Despite relentless attacks on its services in recent years, Medicare remains extremely popular—with 77 percent of Americans viewing it as a “very important” program that needs to be defended, according to a recent poll.

The program has been under assault from sections of the political establishment and corporate America since its inception. In 1995, under the leadership of then-House Speaker Newt Gingrich, Republicans proposed cutting 14 percent from projected Medicare spending and forcing millions of elderly recipients into managed health programs. The aim, in Gingrich’s words, was to ensure that Medicare was “going to wither on the vine.”

In the most open threat to privatize Medicare, Rep. Paul Ryan, Republican of Wisconsin, released a “Path to Prosperity” budget plan in the spring of 2014 that proposed slashing $5.1 trillion in spending over 10 years. Key to his blueprint was the institution of “premium support” in health care for seniors, essentially a voucher plan under which seniors could purchase either private insurance or Medicare coverage.

Fast-forward to the current presidential campaign. Republican candidate Jeb Bush, speaking at an event last week in New Hampshire sponsored by the billionaire Koch brothers, said of Medicare: “We need to figure out a way to phase out this program … and move to a new system that allows them [those over 65] to have something—because they’re not going to have anything.”

Bush and others justify their proposals to privatize or outright abolish Medicare with claims that the program will be bankrupt in the near future. But a recent report shows that projected Medicare spending will account for 6 percent of Gross Domestic Product by 2090, down from earlier projections that it would make up 13 percent of GDP in 2080.

This is hardly an unreasonable amount to spend on the health of the nation’s elderly population. This spending is also not a gift from the government, but is funded through deductions from workers’ paychecks throughout their working lives. However, the policy decisions of politicians in Washington are not driven by preserving the health and welfare of America’s older citizens, but by the defense of the capitalist profit system.

While President Obama and the Democrats seek to distance themselves from proposals to privatize Medicare, Ryan and Bush only openly express what many Democrats are thinking. The Obama administration, with the Affordable Care Act (ACA) leading the charge, is working to gut Medicare and transform it into a poverty program with barebones coverage for the majority of working class and middle class seniors.

In 2013, the Congressional Budget Office estimated that the ACA would reduce Medicare spending by $716 billion from 2013 to 2022. Over the first four years of the ACA, home health care under Medicare is being cut by 14 percent, including $60 million in 2015 and $350 million in 2016. While doing nothing to rein in the outrageous charges by pharmaceutical companies for cancer and other life-saving drugs, the Obama administration’s proposed 2016 budget includes $126 billion in cuts to what Medicare will pay for these drugs.

In what constitutes a historic attack on the program, Obama hailed as a “bipartisan achievement” passage of a bill in April that expands means testing for Medicare and establishes a new payment system in which doctors will be rewarded for cutting costs, while being punished for the volume and frequency of the health care services they provide.

It is telling that an article in the right-wing National Review, headlined “A Medicare Bill Conservatives Need to Embrace,” hailed the legislation and said the effects of its structural reforms would be “permanent and cumulative.”

The bipartisan backing for the Medicare bill is based on common agreement that Medicare spending must be slashed and a radical shift instituted away from the “lavish” fee-for-service system, in which supposedly “unnecessary” tests and procedures are performed on Medicare patients, needlessly treating disease and extending their lives.

The president has claimed that the enactment of the program commonly known as Obamacare is the most sweeping social reform since Medicare was signed into law. This is a cynical lie. The ACA is, in fact, a social counter-reform that was aimed from the start at cutting costs for the government and corporations and reducing and rationing health care for the majority of Americans.

The ACA is designed to encourage employers to slash or end their employee insurance plans, forcing workers to individually purchase plans from private companies on government-run exchanges. The result will be the dismantling of the employer-provided health insurance system that has existed since the early 1950s, a vast increase in workers’ out-of-pocket costs, and a decrease in the care they receive.

Medicare, one of the last vestiges of social reform from a previous era, along with Social Security, is being undermined. The social right to health care—along with the right to a livable income, education, housing, and a secure retirement—is incompatible with a society subordinated to capitalist forces.

True reform of the health care system requires that it be reorganized based on a socialist program that proceeds from the fulfillment of human needs, not the enrichment of a parasitic elite.

Kate Randall

 

http://www.wsws.org/en/articles/2015/07/31/pers-j31.html

US health insurers seek huge rate increases for 2016


By Kate Randall
6 July 2015

Health insurance companies across the US are seeking rate increases of 20 percent to 40 percent and more, according to filings by the insurers with state insurance commissions. Insurance companies cite a larger than expected pool of unhealthy enrollees, high drug prices, and diminishing profits as contributing factors requiring the premium hikes.

The rate increase requests are the latest demonstration of the pro-corporate character of the Affordable Care Act (ACA), commonly known as Obamacare. The news follows the US Supreme Court’s 6-3 ruling on June 25, which upheld government tax subsidies, a critical component of the law that provides tax credits to those purchasing insurance coverage on all the exchanges set up under the ACA.

Under Obamacare’s “individual mandate,” uninsured individuals and families must obtain insurance or face a tax penalty. The premiums for plans purchased on the ACA exchanges go directly into the coffers of the private insurers.

Blue Cross and Blue Shield, one of the nation’s largest insurers, is seeking double-digit increases in many states, including a 23 percent hike in Illinois, 25 percent in North Carolina, 31 percent in Oklahoma, 36 percent in Tennessee, and 54 percent in Minnesota.

The ACA, signed into law in 2010, requires insurance companies to disclose large proposed increases in premiums, and increases of 10 percent or more must be made public and are subject to review under federal law. However, there is no mechanism to rein in the rate hikes if state insurance commissions approve them.

In cynical comments made last week in an appearance in Tennessee, Barack Obama said consumers should put pressure on state insurance regulators to examine the rate requests carefully. “My expectation,” he said, “is that they’ll come in significantly lower than what’s being requested.”

In one example of the opposite result, Oregon Insurance Commissioner Laura N. Cali has just approved rate increases for companies that cover more than 220,000 people. Moda Health Plan, with the state’s largest enrollment, received a 25 percent increase; LifeWise, the second-largest Oregon plan, received a 33 percent hike.

In some cases, state insurance commissions have already granted insurance hikes in excess of those requested by the insurers. In Oregon, Health Net requested increases averaging 9 percent and was granted increases averaging 34.8 percent. Another insurer in the state, Health Co-op, requested a 5.3 percent increase and the state approved a 19.9 percent hike.

Insurers cite the fact that new customers enrolling in ACA plans have turned out to be sicker than expected, which leaves the insurers with a more unhealthy pool of insured customers, requiring them to increase premiums. This is hardly a shocking development, as significant numbers of younger, healthier people have chosen to remain uninsured and risk the cost of getting sick and needing medical care that would have to be paid out-of-pocket.

However, these young people are not simply playing Russian roulette with their health. The driving force behind the decision of many not to enroll in coverage—including the so-called young invincibles, and many workers—is the economic reality that they cannot afford the premiums. To add insult to injury, they face the prospect of being both uninsured and paying tax penalties under the Affordable Care Act. For individuals, these penalties for the uninsured were $95 in 2014, rose to $325 in 2015, and will increase to $695 in 2016.

Another factor contributing to the increased pool of unhealthy customers is a policy adopted by the Obama administration in late 2013 that allowed people to keep insurance plans that did not meet new federal standards, including a set of required benefits such as wellness checks and some screenings. The exemption was a political move on the president’s part to make good on his earlier statements that “If you like your plan, you can keep it.”

Customers may also be required to switch plans in order to keep premium costs down. The Kaiser Family Foundation analyzed 2016 premium changes in 10 states and the District of Columbia where the group found complete data for all insurers for the lowest- and second-lowest cost “silver” plans.

For example, Kaiser found that in Seattle, Washington, an unsubsidized person enrolled in the second-lowest silver plan offered by BridgeSpan in 2015 would see a 12.6 percent premium increase if the individual stayed in the same plan, but would pay 10.1 percent less if the person switched to a similar plan offered by Ambetter.

Kaiser found that those switching plans to save money would potentially have to switch doctors and hospitals as well, and that staying with one’s plan also did not guarantee maintaining a provider network. The entire framework of the ACA is thus skewed to the whims of the insurers, and customers are required to scramble each year to determine their selection.

Health and Human Services (HHS) Secretary Sylvia Burwell told the Times, “You have a marketplace where there is competition and people can shop for the plan that best meets their needs in terms of quality and price.”

The HHS secretary did not mention that the most affordable “bronze” plans come with deductibles in excess of $5,000, which means that coverage for all but “essential” medical services does not kick in until the deductible is paid out-of-pocket. This means that despite being insured, many people will go without health care for themselves or their children, resulting in needless suffering and deaths.

Some of the premium increase requests by the private insurers are staggering. Blue Cross and Blue Shield of New Mexico has requested rate hikes averaging 51 percent for its 33,000 enrollees. Scott & White Health Plan in Texas is seeking a 32 percent rate increase. In a ludicrous statement to the New York Times, Scott & White’s CEO Marinan R. Williams said that the rate hike requests show that “there was a real need for the Affordable Care Act.”

Arches Health Plan, which covers about a quarter of the people insured through Obamacare in Utah, says it collected $39.7 million in premiums in 2014 but paid out $56.3 million in claims. The insurer has requested rate increases averaging 45 percent for 2016.

The Obama administration has touted a provision of Obamacare that requires insurers to spend at least 80 percent of premiums on medical care and related activities. How this works out in reality, however, is that if the profit margins are not to the insurers’ liking, they request premium increases to generate the revenue to pay stockholder dividends and the bloated salaries of insurance company executives.

The CEOs of the top five for-profit health insurance companies—Aetna, Anthem, Cigna, Humana and UnitedHealth—all took home at least $10 million in 2014, according to filings with the Securities and Exchange Commission. Executive compensation ranged from $10.1 million for Humana CEO Bruce D. Broussard to more than $15 million for Aetna CEO Mark Bertolini.

In the latest round of mergers as a byproduct of Obamacare, Aetna Inc. and Humana Inc. announced last week they had reached a deal to merge, creating an insurance company valued at $37 billion. If approved by the government and carried through, the insurer would become the nation’s second largest, covering more than 10 percent of the US population.

 

http://www.wsws.org/en/articles/2015/07/06/prem-j06.html

It Is Expensive to Be Poor

Minimum-wage jobs are physically demanding, have unpredictable schedules, and pay so meagerly that workers can’t save up enough to move on.


Binita Pradham is a single mother who runs a food business and raises her 4-year-old son. (Barbara Reis)

Fifty years ago, President Lyndon B. Johnson made a move that was unprecedented at the time and remains unmatched by succeeding administrations. He announced a War on Poverty, saying that its “chief weapons” would be “better schools, and better health, and better homes, and better training, and better job opportunities.”

So starting in 1964 and for almost a decade, the federal government poured at least some of its resources in the direction they should have been going all along: toward those who were most in need. Longstanding programs like Head Start, Legal Services, and the Job Corps were created. Medicaid was established. Poverty among seniors was significantly reduced by improvements in Social Security.

Johnson seemed to have established the principle that it is the responsibility of government to intervene on behalf of the disadvantaged and deprived. But there was never enough money for the fight against poverty, and Johnson found himself increasingly distracted by another and deadlier war—the one in Vietnam. Although underfunded, the War on Poverty still managed to provoke an intense backlash from conservative intellectuals and politicians.

In their view, government programs could do nothing to help the poor because poverty arises from the twisted psychology of the poor themselves. By the Reagan era, it had become a cornerstone of conservative ideology that poverty is caused not by low wages or a lack of jobs and education, but by the bad attitudes and faulty lifestyles of the poor.

Picking up on this theory, pundits and politicians have bemoaned the character failings and bad habits of the poor for at least the past 50 years. In their view, the poor are shiftless, irresponsible, and prone to addiction. They have too many children and fail to get married. So if they suffer from grievous material deprivation, if they run out of money between paychecks, if they do not always have food on their tables—then they have no one to blame but themselves.

In the 1990s, with a bipartisan attack on welfare, this kind of prejudice against the poor took a drastically misogynistic turn. Poor single mothers were identified as a key link in what was called “the cycle of poverty.” By staying at home and collecting welfare, they set a toxic example for their children, who—important policymakers came to believe—would be better off being cared for by paid child care workers or even, as Newt Gingrich proposed, in orphanages.

Welfare “reform” was the answer, and it was intended not only to end financial support for imperiled families, but also to cure the self-induced “culture of poverty” that was supposedly at the root of their misery. The original welfare reform bill—a bill, it should be recalled, which was signed by President Bill Clinton—included an allocation of $100 million for “chastity training” for low-income women.

The Great Recession should have put the victim-blaming theory of poverty to rest. In the space of only a few months, millions of people entered the ranks of the officially poor—not only laid-off blue-collar workers, but also downsized tech workers, managers, lawyers, and other once-comfortable professionals. No one could accuse these “nouveau poor” Americans of having made bad choices or bad lifestyle decisions. They were educated, hardworking, and ambitious, and now they were also poor—applying for food stamps, showing up in shelters, lining up for entry-level jobs in retail. This would have been the moment for the pundits to finally admit the truth: Poverty is not a character failing or a lack of motivation. Poverty is a shortage of money.

For most women in poverty, in both good times and bad, the shortage of money arises largely from inadequate wages. When I worked on my book, Nickel and Dimed: On (Not) Getting By in America, I took jobs as a waitress, nursing-home aide, hotel housekeeper, Wal-Mart associate, and a maid with a house-cleaning service. I did not choose these jobs because they were low-paying. I chose them because these are the entry-level jobs most readily available to women.

What I discovered is that in many ways, these jobs are a trap: They pay so little that you cannot accumulate even a couple of hundred dollars to help you make the transition to a better-paying job. They often give you no control over your work schedule, making it impossible to arrange for child care or take a second job. And in many of these jobs, even young women soon begin to experience the physical deterioration—especially knee and back problems—that can bring a painful end to their work life.

I was also dismayed to find that in some ways, it is actually more expensive to be poor than not poor. If you can’t afford the first month’s rent and security deposit you need in order to rent an apartment, you may get stuck in an overpriced residential motel. If you don’t have a kitchen or even a refrigerator and microwave, you will find yourself falling back on convenience store food, which—in addition to its nutritional deficits—is also alarmingly overpriced. If you need a loan, as most poor people eventually do, you will end up paying an interest rate many times more than what a more affluent borrower would be charged. To be poor—especially with children to support and care for—is a perpetual high-wire act.

Most private-sector employers offer no sick days, and many will fire a person who misses a day of work, even to stay home with a sick child. A nonfunctioning car can also mean lost pay and sudden expenses. A broken headlight invites a ticket, plus a fine greater than the cost of a new headlight, and possible court costs. If a creditor decides to get nasty, a court summons may be issued, often leading to an arrest warrant. No amount of training in financial literacy can prepare someone for such exigencies—or make up for an income that is impossibly low to start with. Instead of treating low-wage mothers as the struggling heroines they are, our political culture still tends to view them as miscreants and contributors to the “cycle of poverty.”

If anything, the criminalization of poverty has accelerated since the recession, with growing numbers of states drug testing applicants for temporary assistance, imposing steep fines for school truancy, and imprisoning people for debt. Such measures constitute a cruel inversion of the Johnson-era principle that it is the responsibility of government to extend a helping hand to the poor. Sadly, this has become the means by which the wealthiest country in the world manages to remain complacent in the face of alarmingly high levels of poverty: by continuing to blame poverty not on the economy or inadequate social supports, but on the poor themselves.

It’s time to revive the notion of a collective national responsibility to the poorest among us, who are disproportionately women and especially women of color. Until that happens, we need to wake up to the fact that the underpaid women who clean our homes and offices, prepare and serve our meals, and care for our elderly—earning wages that do not provide enough to live on—are the true philanthropists of our society.

 

Microdosing: A New, Low-Key Way to Use Psychedelics

Some of the most surprising people are using LSD and other psychedelics in extremely low doses, and the results are most interesting.


At the fifth annual Horizons: Perspectives on Psychedelics conference in New York City in October 2011, pioneering psychedelic researcher Dr. James Fadiman solidified his reemergence as a leading researcher of and advocate for psychedelic substances. Fadiman had done groundbreaking research with LSD up until the very day it was federally banned in 1966, but after that, he retreated into a life of quiet conventionality—at least on the surface.

While Fadiman kept out of the public eye for decades, he never gave up his interest in and enthusiasm for psychedelics. A year before appearing at Horizons, he published his life’s work, The Psychedelic Explorer’s Guide, an amazing compendium of hallucinogenic lore, as well as a user’s manual for would-be psychonauts.

The book examined the primary uses for psychedelics, such as spiritual enlightenment at high doses and improvements in creativity at smaller ones. It also addressed a lesser-known but increasingly popular phenomenon: microdosing.

Microdosing refers to taking extremely small doses of psychedelics, so small that the effects usually associated with such drugs are not evident or are “sub-perceptual,” while going about one’s daily activities. It’s being done by everyone from harried professionals to extreme athletes to senior citizen businesswomen, and they’re claiming serious benefits from it.

To trip brains (or have a transcendental experience) on LSD, a dose of 400 micrograms or more is called for; to explore your inner self, take 200 micrograms; for creative problem solving, try 100 mikes; but for microdosing, take only 10 to 15 micrograms. Similar microdoses for other psychedelics would include 0.2-0.5 grams of dried mushrooms (about one-fifth the normal dose) or about 50-75 micrograms of mescaline.

At that Horizons conference, as reported by Tim Doody in a fascinating profile of Fadiman, the bespectacled 70-year-old at one point asked his audience “How many of you have heard about microdosing?” A couple of dozen hands went up. “Whoa,” he exclaimed.

He explained that, beginning in 2010, he had been doing a study of microdosing. Since research with LSD remains banned, he couldn’t do it in a lab, but had instead relied on a network of volunteers who administered their own doses and reported back with the results. The subjects kept logs of their doses and daily routines, and sent them via email to Fadiman. The results were quite interesting, he said.

“Micro-dosing turns out to be a totally different world,” he explained. “As someone said, the rocks don’t glow, even a little bit. But what many people are reporting is, at the end of the day, they say, ‘That was a really good day.’ You know, that kind of day when things kind of work. You’re doing a task you normally couldn’t stand for two hours, but you do it for three or four. You eat properly. Maybe you do one more set of reps. Just a good day. That seems to be what we’re discovering.”

Study participants functioned normally in their work and relationships, Fadiman said, but with increased focus, emotional clarity, and creativity. One physician reported that microdosing put him “in touch with a deep place of ease and beauty.” A singer reported being better able to hear and channel music.

In his book, a user named “Madeline” offered this report: “Microdosing of 10 to 20 micrograms (of LSD) allow me to increase my focus, open my heart, and achieve breakthrough results while remaining integrated within my routine. My wit, response time, and visual and mental acuity seem greater than normal on it.”

These results are not yet peer-reviewed, but they are suggestive.

“I just got a report from someone who did this for six weeks,” Fadiman said. “And his question to me was, ‘Is there any reason to stop?’”

It isn’t just Fadiman acolytes who are singing the praises of microdosing. One 65-year-old Sonoma County, California, small businesswoman who had never heard of the man told AlterNet she microdosed because it made her feel better and more effective.

“I started doing it in 1980, when I lived in San Francisco and one of my roommates had some mushrooms in the fridge,” said the woman, who asked to remain anonymous. “I just took a tiny sliver and found that it made me alert and energized all day. I wasn’t high or anything; it was more like having a coffee buzz that lasted all day long.”

This woman gave up on microdosing when her roommate’s supply of ‘shrooms ran out, but she has taken it up again recently.

“I’m very busy these days and I’m 65, so I get tired, and maybe just a little bit surly sometimes,” she admitted. “So when a friend brought over some chocolate mushrooms, I decided to try it again. It makes my days so much better! My mood improves, my energy level is up, and I feel like my synapses are really popping. I get things done, and I don’t notice any side-effects whatsoever.”

She’s not seeking visionary experiences, just a way to get through the day, she said.

In an in-depth post on the High Existence blog, Martijn Schirp examined the phenomenon in some detail, as well as describing his own adventure in microdosing:

“On a beautiful morning in Amsterdam, I grabbed my vial of LSD, diluted down with half high grade vodka and half distilled water, and told my friend to trust me and open his mouth. While semi-carefully measuring the droplets for his microdose, I told him to whirl it around in his mouth for a few minutes before swallowing the neuro-chemical concoction. I quickly followed suit,” Schirp wrote. “We had one of the best walking conversations of our lives.”

James Oroc, author of Tryptamine Palace: 5-MeO-DMT and the Sonoran Desert Toad, exposed another realm where microdosing is gaining popularity. In a Multidisciplinary Association for Psychedelic Studies monograph titled “Psychedelics and Extreme Sports,” Oroc extolled the virtues of microdosing for athletes. Taking low-dose psychedelics improved “cognitive functioning, emotional balance, and physical stamina,” he wrote.

“Virtually all athletes who learn to use LSD at psycholytic [micro] dosages believe that the use of these compounds improves both their stamina and their abilities,” Oroc continued. “According to the combined reports of 40 years of use by the extreme sports underground, LSD can increase your reflex time to lightning speed, improve your balance to the point of perfection, increase your concentration until you experience ‘tunnel vision,’ and make you impervious to weakness or pain. LSD’s effects in these regards amongst the extreme-sport community are in fact legendary, universal, and without dispute.”

Even the father of LSD, Albert Hofmann, seems to have been a fan. In his book, Fadiman notes that Hofmann microdosed himself well into old age and quotes him as saying LSD “would have gone on to be used as Ritalin if it hadn’t been so harshly scheduled.”

Psychonauts, take note. Microdosing isn’t going to take you to another astral plane, but it may help you get through the day. For more information on the microdosing experience, dig into the links up-story, as well as the Reddit user forum on microdosing. Surprisingly enough, the vaults of Erowid, that repository of drug user experiences, returned only one entry about microdosing, from someone who appears to have been a subject in the Fadiman microdosing experiments.

And, of course, if you want to try this, you have to obtain some psychedelics. They’re illegal, which doesn’t mean they aren’t around. An increasing number of people are finding them on the dark web; others obtain them the old-fashioned way: from within their own communities. Those who are really interested will get to work.

 

Phillip Smith is editor of the AlterNet Drug Reporter and author of the Drug War Chronicle.

 

http://www.alternet.org/drugs/microdosing-new-low-key-way-use-psychedelics?akid=13216.265072.1ZXgbS&rd=1&src=newsletter1037865&t=13

Why Are Rates of Suicide Soaring Across the Planet?

PERSONAL HEALTH

The figures of both attempted suicides and committed suicides are increasing.

A friend recently asked to meet for coffee. ‘I’ve had some more bad news,’ his text said. A ‘fifty-something’-year-old friend had taken his own life the day before. Jack had hanged himself from a tree in a public park on the outskirts of London; it was his fourth attempt. He had four children. This was the second middle-aged male friend to have committed suicide within six months.

Their stories are far from unique. Suicide occurs everywhere in the world to people of all age groups, from 15 to 70 years. The World Health Organisation (WHO) says that almost one million people commit suicide every year, with 20 times that number attempting it, and the numbers are rising. Methods vary from country to country: in the USA, where firearms litter the streets, 60% of those who take their own lives shoot themselves; in India and other Asian countries, as well as South Africa, taking poison, particularly drinking pesticides, is the most popular choice. In Hong Kong, China and urban Taiwan, WHO notes that a new method, “charcoal-burning suicide,” has emerged. Drowning, jumping from a height, slashing wrists and hanging (the most popular form in Britain, the Balkans and Eastern European countries) are some of the other ways desperate human beings decide to end their lives.

Stigma and Under-reporting of Suicides

According to WHO, 1.5% of worldwide deaths were caused by suicide in 2012, making it the third highest cause of death in the world, and this counts only those deaths which have been confirmed as suicide. WHO admits that the availability and quality of data is poor, with only 60 Member States providing statistics “that can be used directly to estimate suicide rates.” Many suicides, they say, “are hidden among other causes of death, such as single car, single driver road traffic accidents, un-witnessed drownings and other undetermined deaths.” These are just some of the many factors that make accurately assessing the number of people who take their own lives problematic. In countries where social attitudes or religious dogma shroud suicide in a stigma of guilt (Sub-Saharan Africa, where suicide is rarely if ever discussed or admitted, for instance), suicide may be hidden and go un-reported; so too in countries where suicide is still regarded as a criminal act: Hungary, for example, where attempted suicide carries a prison sentence of five years; Japan, where it is illegal to commit suicide; North Korea, where relatives of a person committing suicide are penalised; Ireland, where self-harm is not generally regarded as a form of attempted suicide; Singapore, where suicide remains illegal and attempted suicide can result in imprisonment; or Russia, where the rate of teenage suicides is three times the world average and where those attempting suicide can be committed to a psychiatric hospital. All of these are pretty strong reasons for hiding suicide attempts and concealing suicide as the cause of death, as well as deterring people from discussing suicidal thoughts.

Whatever the precise number of total deaths by suicide – and all the indications are that it is a good deal higher than WHO says – what is clear is that suicide is a major social issue. The figures for both attempted and completed suicides are increasing; the subject needs to be openly discussed, the causes understood and more support provided. WHO states that suicide rates have increased by 60% in the last 45 years, and unless something marvellous happens that drastically changes the environment in which we are living, it predicts that by 2020 the rate of death will have doubled – from one suicide every 40 seconds to someone, somewhere in the world taking his/her life every 20 seconds!

Rates of suicide and gender ratios vary from country to country and region to region, but overwhelmingly men are more at risk than women. WHO found that 75% of global suicides occurred in low- and middle-income countries, with 30% of all suicides occurring in China and India, where suicide was only de-criminalised in 2014. Eastern European countries, such as Lithuania and the Russian Federation, recorded the highest numbers of suicides; the Eastern Mediterranean Region and Central and South America (Peru, Mexico, Brazil and Colombia) the lowest. And although suicide rates worldwide have traditionally been highest amongst elderly men, young people – 15-29-year-olds – are now the group at the greatest risk in a third of all countries. Suicide, WHO states, is the “leading cause of death in this age group after transport and other accidents and assault for males,” with very little gender difference – “9.5% in males and 8.2% in females.”

Throughout western societies around three times as many men die by suicide as women, and the over-50s are particularly vulnerable. In Britain men account for 80% of all suicide cases (with an average of 13 men a day killing themselves), and 40-44-year-olds are particularly at risk. In “low- and middle-income countries”, WHO records, “the male-to-female ratio is much lower [than more developed countries] at 1.5 men to each woman.” Surprisingly, in the USA, where four times as many men die from suicide as women, according to the Centers for Disease Control and Prevention, women are more likely to attempt it. The statistical gender gap in western societies may in small part be caused, The Samaritans think, by the different suicide methods used by men and women, meaning that in some cases “the intent cannot be determined (or assumed) as easily [with women] as in methods more common to males.” This may result, they say, “in more under-reporting of suicidal deaths in females.”

The Causes of Suicide

The specific reasons why people commit suicide are many and varied; ‘mental health issues’ is the umbrella term often cited as the cause. According to researchers at Glasgow University, 90% of suicide cases suffer from some form of mental illness. It is an ambiguous phrase, though, one that explains little and comforts the bereaved less. It would seem obvious that if someone kills themselves, they are not feeling mentally or emotionally ‘intact’, or ‘good’. ‘I struggled for so long’, ‘I couldn’t cope anymore’, ‘life seemed meaningless’, ‘I felt tremendous anxiety’, and so on, are phrases common to many of us, including those people contemplating, attempting or committing suicide. Perhaps understandably, depression is usually mentioned as a cause, but this of course does not mean everyone suffering from depression is at risk of suicide!

The WHO makes clear that whilst suicide rates vary enormously from country to country, differences “influenced by the cultural, social, religious and economic environments in which people live and sometimes want to stop living … the pressures of life, that cause extreme emotional distress” and sometimes lead to suicide, “are similar everywhere.”

It is these ‘pressures of life’ that need to be properly understood: what they are, where they come from, the impact they have, and how we can change the structure of society to free humanity from them. Why do we have such damaging ‘pressures of life’? We should not be living in a world that produces such detrimental forces. Something in our world society is terribly wrong when a million or so people kill themselves every year and suicide is the second highest cause of death amongst under-20-year-olds.

I am not a psychologist, but commonsense would suggest that the ‘sense of self’ must be at the heart of the issue, the volatile central cause. If that ‘sense of self’ is positive, if one feels connected to ‘life’, has structure, purpose and self-belief, feels liked, loved even, then suicide would seem unlikely. If, however, the image of self is negative, of a ‘failure’, unable to ‘fit in’, feeling lost, lacking direction and experiencing social and emotional withdrawal, a fragile sense of self and increasing vulnerability are, it would seem, likely.

Then there are the practical problems we all face of earning a living and paying the rent or mortgage, and the more subtle pressures of ‘succeeding’ – economically, socially, in a career, and in ‘love’. The inability – real or perceived – to meet these ‘pressures of life’ creates worry and anxiety – perhaps leading to alcohol or substance abuse – which strengthens social isolation, reinforces the image of failure, weakens self-belief and confidence and deepens self-loathing. And all this in a world where weakness, particularly in men, is frowned upon; where sensitivity, uncertainty and fragility are to be overcome – ‘toughen up’ is the message, spoken directly or indirectly.

We have little understanding of who and what we are, so we create images, cling to ideological constructs that move us further and further away from our true nature. The ideal image of what it means to be a human being, particularly a man, has become increasingly narrow. Men, especially under 40 year olds, must be decisive, strong and ambitious. Any flowery beliefs – philosophical or religious for example – should be eradicated, or at least hidden, certainly not mentioned in public. Any admission of self-doubt and signs of vulnerability should be completely avoided, and a macho, no-nonsense approach to life adopted and expressed.

Broadly speaking, this has become the stereotype of what it is to be a man in the 21st century, and conformity to the pattern is insisted on – via education, peer pressure and the corporate media. Women, particularly young women, are expected to meet a similar, if slightly less constricting, formulated ideal. Both are extremely restrictive, unhealthy images that fit into a worldwide system of societal uniformity, built by, and in the interests of, multinationals (who own everything), facilitated by corporate governments (who lack principles), which is sucking the richness and diversity out of life. Everyone is expected to want the same things, to wear the same clothes, believe the same propaganda, aspire to the same ideals and behave the same. Every country, city, town and village is seen as a marketplace, every person a consumer to be exploited fully, sucked dry and discarded.

Competition and conformity have infiltrated every area of worldwide society, from education to health care. Everything and everyone is seen as a commodity, to be bought at the lowest price and sold at the highest, financial profit is the overwhelming motive that drives and distorts action. Materialistic values promoting individual success, greed and selfishness saturate the world; ‘values’ that divide and separate humanity, leading to social tension, conflict and illness. Ideals, which are not values in any real sense of the word, which have both fashioned the divisive political-economic landscape in which we live (which has failed the masses and poisoned the planet), and been strengthened by it. Together with the economic system of market fundamentalism which so ardently promotes them, these ‘values’ form, I believe, the basic ingredients in the interwoven set of social factors that cause a great deal of the ‘mental health issues’, which lead those most vulnerable members of our society to commit suicide. Men, women and children who simply cannot cope with the ‘pressures of life’ anymore, who feel the collective and individual pain of life acutely, are disposed towards introspection and find the world too noisy, its values too crude, its demands of ‘strength’ not weakness, ‘success’ not failure, ‘confidence’ not doubt, impossible to meet. And why should they have to meet them, why do these ‘pressures of life’ exist at all?

It is time to build an altogether different, healthier model, a new way of living in which true perennial values of goodness, shape the systems that govern the societies in which we live, and not the corrosive, ideologically reductive corporate weapons of ubiquitous living which are sucking the beauty, diversity and joy out of life. Values of compassion, selflessness, cooperation, tolerance and understanding; we need, as Arundhati Roy puts it, “to redefine the meaning of modernity, to redefine the meaning of happiness,” for we have exchanged happiness for pleasure, replaced love with desire, unity with division, cooperation with competition, and have created a divided society, where conflict rages, internationally, regionally, communally and individually.

Graham Peebles is director of the Create Trust. He can be reached at: graham@thecreatetrust.org

Deadly food poisoning is lurking

A new highly contagious, antibiotic-resistant bug has arrived

A frightening new Scientific American report details the growing threat of shigella across the United States

This article was originally published by Scientific American.

The kinds of bacteria that can cause food poisoning lurk all around us. These germs can be especially easy to pick up when traveling internationally as well as in places, such as children’s day cares, which are hard to keep clean. The infections usually clear up on their own but sometimes require hospitalizations and hefty doses of antibiotics to expunge. Unfortunately, the bacteria are becoming increasingly resistant to treatment.

The latest bad news came in April when the U.S. Centers for Disease Control and Prevention reported an outbreak of Shigella sonnei that has become resistant to ciprofloxacin—one of the last remaining medications in pill form that can kill the germ. Since then a Scientific American investigation shows the worrisome strain is still circulating in the U.S. a year after it first emerged.

Shigella bacteria typically cause about 500,000 diarrheal illnesses and 40 deaths in the U.S. every year. Children who are malnourished and people with compromised immune systems are particularly at risk of developing severe cases. Symptoms include diarrhea that is sometimes bloody, fever and abdominal pain, and typically last about a week.

The bacteria occur naturally in the U.S. but, heretofore, people typically caught ciprofloxacin-resistant strains while traveling internationally. In the current outbreak, however, many people who became sick had not recently been out of the country, indicating that the multidrug-resistant bug has now established a firm domestic presence.

The CDC has confirmed 275 cases of ciprofloxacin-resistant shigella across the country from May 2014 to May 2015, according to data obtained exclusively by Scientific American. Although these figures appear small, they almost certainly represent but a tiny fraction of the true number of ciprofloxacin-resistant cases. Shigella infections are supposed to be reported to the CDC, but a lot of people who get sick do not go to the doctor. And those who do are sometimes not tested for the presence of shigella, let alone drug resistance.

Vulnerable populations are some of the hardest hit in this outbreak, including cases linked to a day care center, homeless people in San Francisco and HIV-positive individuals in Philadelphia. As few as 10 shigella germs can cause an infection—making the bacteria virtually undetectable as it quickly spreads in contaminated food and water or from person to person.

Other drugs that the pathogen has overcome in the past include ampicillin, streptomycin and tetracycline. Anna Bowen, a medical officer in the CDC’s Waterborne Diseases Prevention Branch and lead author of the April study, says the CDC has identified some cases in this outbreak that were resistant to all of the oral treatment options currently available. The next line of defense is a broader-spectrum, more expensive antibiotic that must be administered via injection or an intravenous line.

Whereas labs can test for ciprofloxacin resistance, there are currently no standardized tests to identify if a shigella infection is resistant to azithromycin, which is the go-to drug for children. (The U.S. Food and Drug Administration has approved ciprofloxacin only for adults.) “Almost no clinical labs are doing this sort of testing,” Bowen says, “and so patients are being treated kind of blindly since the providers don’t know if azithromycin is an appropriate choice or not.”

Lag time in reporting is another issue. San Francisco, for example, is tracking nearly two times the number of cases that the CDC counts as confirmed for the city—228 cases versus 119. Cora Hoover, director of Communicable Disease Control and Prevention for the San Francisco Department of Public Health, says the two agencies use slightly different case definitions: as the agency on the ground investigating this outbreak, the city wants assurance that all possible patients are identified, and confirming a case takes a long time. Public health officials normally follow up with each patient, and lab tests can take weeks.

It can take around a month to confirm a case of shigellosis is both antibiotic-resistant and part of the same outbreak, though it varies. Generally, once a doctor identifies a shigella infection, he or she reports it to the city or state public health agency and sends a stool sample to the lab to confirm the diagnosis. The lab grows or “cultures” the bacteria and reports its findings back to the doctor and agency in about a week. The health agency then reports the case to the CDC, which tests a selection of cases for antibiotic resistance via the National Antimicrobial Resistance Monitoring System and its national laboratory network, PulseNet. Results from PulseNet’s genetic testing of sample cases can be complete within a couple of weeks.

By the time the full picture of a single case is confirmed, the patient is usually better. Caroline Johnson, director of the Division of Disease Control at Public Health for the City of Philadelphia, says her division usually suspects that a case is part of an outbreak but does not know for sure until the full results are in.

Peter Gerner-Smidt, chief of the CDC’s Enteric Diseases Laboratory Branch and PulseNet, says labs will gradually move away from having to culture bacteria to identify them. As genetic testing becomes cheaper and more accessible, state labs will eventually be able to get that information by determining the whole DNA sequence of each sample. This approach will hopefully reveal antibiotic resistance more quickly, he says, but it will likely take years before these tests are widely used.

Because of the increasing threat of multidrug-resistant shigella, the CDC and other health agencies recommend doctors only prescribe antibiotics for severe cases. Shigellosis can actually clear up on its own with proper hydration and rest. Prevention is therefore the best weapon for controlling resistant shigella, Bowen says, particularly because the U.S. cannot regulate antibiotic overuse in other countries, but it still affects patients here.

“Problems with antibiotic resistance anywhere are problems with antibiotic resistance everywhere,” she says. “There are no borders when it comes to antibiotic resistance, and we have all got to be vigilant.”

 

 

 

http://www.salon.com/2015/05/19/deadly_food_poisoning_is_lurking_partner/?source=newsletter

Smells like doomed genius: “Montage of Heck” captures the contradictions of Kurt Cobain — and the America that shaped him

Yes, it’s Courtney-approved, but this documentary is a moving and powerful portrait of Kurt Cobain’s America

“Kurt Cobain: Montage of Heck” (Credit: Sundance Institute)

I remember coming to work on the morning Kurt Cobain was found dead, and feeling puzzled that a younger writer at our San Francisco alternative weekly – who would go on to become a prominent newspaper and magazine editor in New York – was so upset that she sat at her desk all day crying. I could psychoanalyze myself at Cobain-like depth, but the reasons I didn’t get it were basically stupid and defensive. Of course I knew Cobain’s music, and I understood that his death was a big story. But I was also deeply committed to my own disillusionment, to never being taken by surprise. I had already been through the first wave of punk rock, the worst years of AIDS, the deaths of a lot of people less famous than him. I would have rejected Cobain’s status as generational icon even more forcefully than he did – which, in retrospect, looks a lot like deep yearning, thinly wrapped in snobbery. His combination of suburban angst, drug addiction and mental-health issues was an old story, wasn’t it? Just another “Rock ‘n’ Roll Suicide,” a song David Bowie wrote in 1972. Nothing to cry about.

Fourteen years later, I was with my kids at a beachfront amusement park when my friend Laura Miller, Salon’s book critic, called to tell me that David Foster Wallace was dead. I got out of the roller coaster line to talk to her – Laura knew Wallace, but I didn’t – and one of the first things to swim into my brain, addled as it was by sunshine and a friend’s grief, was Kurt Cobain. At the time, I understood the connection as a personal commandment to have this experience, complete with all the Cobain-like and Wallace-like ironic introspection it might require; I took it as an edict not to insulate myself against the shared emotion, and potential shared meaning, of this moment of collective mourning. It took longer to see that the linkages between Cobain and Wallace go much deeper than that, and that many other people registered the connection in approximately the same way.

For many viewers of Brett Morgen’s extraordinary HBO documentary “Kurt Cobain: Montage of Heck,” the most fascinating and powerful elements of the film will be found in the intimate home videos shot by Cobain and Courtney Love in the early ‘90s, before and after their daughter Frances was born. (Frances Bean Cobain is an executive producer of the film, and both its remarkable depth and its limitations derive from the fact that it’s an authorized biography, made with the cooperation of Love, Cobain’s parents and various former friends and bandmates.) That footage is absolutely heartbreaking in its depiction of a loving, flawed, high-spirited and essentially normal young family, a long way from the drug-crazed rock-star fiends favored by the tabloids of that not-so-distant era. Yes, rock fans, you do get to see Courtney naked. Impressive as that is, it’s not half as much fun as hearing her ventriloquize baby Frances complaining that her dad’s band are self-indulgent whiners who aren’t as good as Guns N’ Roses. (Footnote for scholars: Cobain’s obsession with GnR frontman Axl Rose is fascinating, but ultimately aren’t they more alike than different?)

But I watched that amazing material with a sense that by that time the die had already been cast. Love and Cobain were famous and their baby, allegedly born addicted to heroin, was famous too. What they were “really like,” as human beings, was irrelevant. As long as they lived they were going to be famous rock ‘n’ roll fuckups, damaged symbols of a damaged generation. For someone with Cobain’s particular set of neuroses, ailments and vulnerabilities, not to mention his philosophical and aesthetic predilections, that might literally be a fate worse than death. I’m not saying that other outcomes, not involving a shotgun blast to the head, were not possible. But there was no easy or painless exit from the prison-house of celebrity available to Kurt Cobain, and he didn’t much like living in it.

Morgen’s title refers both to an extended audio collage Cobain once recorded on cassette tape – just one example of his explosive, unstoppable cultural output – and to the method of the film itself, which assembles an immense trove of public and private material to illustrate a life spent first in obscurity and then in the unbearable spotlight. He has Cobain’s famous notebooks full of lyrics, journal entries, cartoons and momentary observations, of course, but also home movies of his 3rd birthday party, a collection of family snapshots, recordings of early radio interviews and footage of the first Nirvana shows in Aberdeen or Olympia, with a few dozen people in attendance.

He interviews Wendy O’Connor, Cobain’s overly loquacious mother, Don Cobain, his monosyllabic father, and Tracy Marander, who was Cobain’s first serious girlfriend and the first woman he lived with. (He was a total deadbeat, from the sound of things, but Marander doesn’t seem to regret working for a living while he played guitar and watched TV. Here she is in a movie, all these years later.) Oh, and there’s music – a lot of it, the famous tracks and a bunch of lesser-known ones. You will indeed hear “Smells Like Teen Spirit,” in a number of versions and a variety of contexts – and when we finally get the actual Nirvana recording over the closing credits, well, I’m not saying I cried in grief and joy and anger but I’m not saying I didn’t.

Rather than trying to describe all these people who have lived on and gotten older, and who now find themselves sitting on their couches struggling to describe or explain a guy they used to know who became very famous and then died, I would say that “Montage of Heck” paints a bitter but compassionate portrait of the downscale white America that shaped Kurt Cobain. He was born in 1967, which surely felt more like 1957 in Aberdeen, Washington, than it did in the tumultuous climate of big cities and college towns. O’Connor says she remembers Aberdeen as a wonderful place to raise a family, and that her kids had a happy childhood. Not much later in the film we hear Cobain describe Aberdeen, in a recorded conversation with an old friend, as an “isolated hellhole” dominated by moralistic Reaganite conformity. You don’t get the feeling that teenage Kurt was an easy kid to live with, or someone who naturally made the best out of difficult circumstances. But his inarticulate sense that the society around him was fundamentally inauthentic, and his yearning to transform it or destroy it, molded one of the last and greatest voices of what Casey Kasem used to call the “rock era.”

Teenage alienation and rebellion are of course not new phenomena, and are not unique to the depressed lumber towns of the Pacific Northwest (although I imagine that setting lent them a particular coloration). In the animations Morgen’s team has created to illustrate Cobain’s audio montages, we witness the highly familiar quality of Cobain’s childhood and teen years: His parents were unhappy and got divorced, he smoked a lot of pot and had frustrating sexual experiences, he was an intelligent and creative kid who found school to be soul-deadening and found some release in loud music. There may be no comprehensible answer to the question of why he responded so keenly to these stimuli, which were applied with equal force to millions of other kids of the downward-trending ‘70s and ‘80s. From an early age, Kurt Cobain yearned to make memorable art, escape his surroundings and become famous, and from an early age he contemplated ending his life, with the kind of obsessive, repeated “jokes” that are impossible to gauge from the outside.

If Cobain and Wallace worked in different mediums and different registers, and emerged from different sectors of middle-class white suburbia – indeed, you can only call Cobain’s background “middle class” under the postwar convention that all white Americans who have jobs and cars belong to that class by definition – there is no mistaking the kinship of their unnaturally keen responses. They were 1960s babies who grew up amid Vietnam and Watergate and the gas crisis and Whip Inflation Now and Jimmy Carter in his cardigan talking about our “national malaise,” and who were teenagers and young adults as that malaise and turmoil turned to amnesia and denial and the suicidal, delusional counterrevolution of the Reagan years. America has not recovered from the cultural and political whiplash of those years and probably never will.

All of us who lived through that period bear the scars, and we have all tried to react to it and push forward as best we can. Of course Wallace is not the only important writer of their generation, nor is Cobain the only memorable singer-songwriter. But they are joined by the intensity of their response – “Nevermind” and “Infinite Jest” are highly singular works in totally different traditions, but I think they represent the same scale of achievement and possess a similar cultural resonance – and by the way they touched a deep well of passion, hunger and unease that transcended demographic or generational clichés. It’s by no means irrelevant that they were both white heterosexual men who were deeply aware of the problematic nature of the Great Man archetype, and committed to addressing that issue in their work and their private lives. And it’s certainly not irrelevant that they became overwhelmed by the vicious contradictions of fame in our era — or, to put it more simply, that they could not escape the private demons of mental illness and drug addiction and ended by killing themselves.

As I noted earlier, “Montage of Heck” was made with the cooperation of Courtney Love and several other relatives or intimate friends of Kurt Cobain. (The most prominent omission is Nirvana drummer Dave Grohl.) Among other things, that means the movie does not traffic in any of the pathological conspiracy theories around Cobain’s death, or indeed depict his death in any way. It may whitewash some details of Love and Cobain’s relationship – I wouldn’t know, and don’t especially care – and it certainly depicts the reporters who raked up dirt on the couple, especially Lynn Hirschberg of Vanity Fair, as unscrupulous vultures.

I would agree that the media’s vampirical obsession with the Kurt-and-Courtney story was not journalism’s finest hour, and that it reflected profound anxiety about the youth-culture moment they were seen to represent. But that’s too large a problem to unpack here; I think it’s best to take the Courtney-centric area of the film with a grain of salt and draw your own conclusions. Those are minor issues in a masterful and often deeply moving portrait of a volatile American genius, a portrait that goes far beyond one man, one family and one rain-sodden small town. It depicts the society that nurtured and fed that genius, and that made his unlikely creative explosion possible, as being the same environment that poisoned him — and suggests that the rise and fall were inextricably connected. Kurt Cobain was a canary in the coalmine, as was David Foster Wallace. You and I are still in it, and it’s getting harder to breathe.

“Kurt Cobain: Montage of Heck” opens this week in Los Angeles, New York and Seattle, and then premieres May 4 on HBO.

 

http://www.salon.com/2015/04/23/smells_like_doomed_genius_montage_of_heck_captures_the_contradictions_of_kurt_cobain_and_the_america_that_shaped_him/?source=newsletter
