US health insurance deductibles rise sharply, far outpacing wage gains


By Kate Randall
25 September 2015

The steady increase in health insurance deductibles paid by US workers over the last five years is more than six times the average wage increase, according to a new analysis released Tuesday by the Kaiser Family Foundation.

Businesses are steadily raising the amount employees must pay for health care before their coverage kicks in, resulting in workers and their families forgoing medical care because they cannot afford it. The Affordable Care Act (ACA) is a major contributing factor to this phenomenon through its excise tax on “lavish” health plans set to go into effect in 2018.

Kaiser, a health policy research group that tracks employer and other health insurance plans and benefits, calculates that insurance deductibles have risen more than six times faster than workers’ wages since 2010. Four of five workers with employer-sponsored health insurance now pay a deductible.

Both the share of workers with deductibles and the size of those deductibles have increased sharply in the last five years, according to the study. Deductible amounts have risen by an average 67 percent during this same period, while premiums rose a comparatively moderate 24 percent as more employers shifted the cost of health insurance onto workers.

The 67 percent deductible hike has dwarfed the average rise in workers’ wages, which have increased a miserable 10 percent in the wake of the 2008 financial crisis, according to the Kaiser study.

The 17th annual Kaiser/Health Research & Educational Trust survey analyzed the responses of nearly 2,000 small and large employers on health insurance coverage, costs and deductibles, as well as actions businesses have taken in relation to the ACA, popularly known as Obamacare.

Last year, 98 percent of large firms (200 workers or more) offered insurance coverage, compared to 47 percent of the smallest firms (three to nine workers). The 98 percent statistic, however, is deceptive, as what constitutes “coverage” is being continually downgraded by the raising of deductibles and other out-of-pocket costs.

The survey found that the average annual premium for single coverage is $6,251, of which workers pay on average $1,071. For a family the average premium is $17,545, with workers contributing on average $4,955.

In addition to their portion of the premium, workers must then cover the deductible, which must be paid before coverage for all but certain “essential” tests and services can begin. One in five workers has a deductible of $2,000 or more.

The Kaiser survey also provides an early look at employers’ response to the Obamacare tax on higher-cost health plans, often referred to as the “Cadillac tax.” Beginning in 2018 a 40 percent tax will be levied on individual plans with premiums exceeding $10,200 or family plans costing more than $27,500.
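As designed, the 40 percent levy falls only on the portion of a plan’s cost above the threshold, not on the entire premium. A minimal sketch of the calculation (the premium figures below are hypothetical, chosen only to illustrate the mechanics):

```python
# Sketch of the ACA "Cadillac tax": a 40 percent excise tax on the portion
# of a plan's annual premium above the statutory threshold.
# The example premiums are hypothetical, not figures from the Kaiser survey.

CADILLAC_RATE = 0.40
THRESHOLDS = {"individual": 10_200, "family": 27_500}

def cadillac_tax(premium: float, plan_type: str) -> float:
    """Return the excise tax owed on the excess above the threshold."""
    excess = max(0.0, premium - THRESHOLDS[plan_type])
    return CADILLAC_RATE * excess

print(cadillac_tax(12_000, "individual"))  # 720.0: 40% of the $1,800 excess
print(cadillac_tax(9_000, "individual"))   # 0.0: below the threshold
print(cadillac_tax(30_000, "family"))      # 1000.0: 40% of the $2,500 excess
```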

This tax is one of the key features of Obama’s health care legislation and is aimed at gutting health care for those receiving coverage through their employers, or close to half of the US population.

Fifty-three percent of large employers surveyed have conducted an analysis to determine whether any of their plans exceed the Cadillac tax thresholds, with about one in five in the group saying their plans with the largest enrollment will exceed it. Thirteen percent of large firms offering health benefits say they have already made changes to their plans to avoid stepping over the limit, while 8 percent say they have switched to a lower cost health plan.

“Our survey finds most large employers are already planning for the Cadillac tax, with some already taking steps to minimize its impact in 2018,” Kaiser study lead author Gary Claxton said in a statement. “Those changes likely will shift costs to workers, but exactly how and how much will vary for individual workers.”

Steps already taken by employers to lower health care costs include eliminating a hospital or a health system from the network on one of their plans (9 percent of firms) and offering a “narrow network” plan, one generally considered more limited than the standard HMO network (7 percent).

The Kaiser study shows that employers are waging a two-pronged attack on employee health benefits. Workers are required to pay steadily rising out-of-pocket expenses in the form of skyrocketing deductibles, while the coverage they are receiving deteriorates. The specific aim of the costs borne by workers is to discourage them from seeking treatment for themselves and family members.

Some of the “creative” solutions encouraged by employers to help workers self-ration care include offering plans with access to a doctor or nurse over the telephone or Skype as an alternative to a trip to the doctor or emergency room. Employers are also offering workers online tools that show the cost of a particular test or doctor visit, the distinct possibility being that sticker shock will keep them from seeking treatment.

The shift being effected in employer-sponsored health care is working in tandem with the central component of Obamacare, the “individual mandate.” Under this provision, individuals and families without insurance through an employer or a government program such as Medicare or Medicaid must obtain insurance or pay a tax penalty.

People can shop for insurance from private insurance companies on the health care exchanges set up by the ACA, where some low- and middle-income individuals receive modest subsidies to offset premium prices.

Customers buying coverage on the federal and state exchanges have already discovered that many of the most affordable Bronze plans come with deductibles in excess of $5,000. Despite these high out-of-pocket costs, private insurers are already seeking and receiving double-digit premium rate hikes from state insurance commissions.

Workers receiving coverage through either their employers or the Obamacare exchanges find themselves in the increasingly untenable position of trying to cover mounting health care costs for their families while their wages stagnate or decline in real terms.

Rave SWAT Teams

Cops Cracking Down on ‘Evil’ People Who Dance Late at Night

The motion is a reactionary step following the death of two people at last month’s HARD Summer music festival.

Los Angeles, CA — Last week, local politicians in Los Angeles voted unanimously to create a new police task force that will be entirely focused on electronic dance music events. Earlier this summer, Los Angeles County supervisors proposed banning these types of events entirely. While that option remains on the table, they are now moving to crack down on these events until they can institute a complete ban.

The recent proposal to create a police task force was put forward by county supervisors Hilda Solis and Michael Antonovich. The motion reads: “Ultimately, in the interest of public safety, a ban of electronic music festivals at county-owned properties remains a possibility that will continue to be evaluated.”

“I want to emphasize that our efforts around this motion, above all, are about the health and safety of those attending these events. No lives should be lost while attending any music event,” Solis said in a statement.

The motion is a reactionary step following the death of two people at last month’s HARD Summer music festival at the Pomona Fairgrounds.

While deaths at events are a concern, they are largely due to drug prohibition, which makes the drugs people take more dangerous. To make matters worse, the zero-tolerance policy at events prevents drug users from getting the help they need when something does go wrong.

The development of a rave task force is reminiscent of the fear-mongering propagated about raves in the late 1990s.

In April of 2003, the government passed a law that everyone could agree on, the Amber Alert Bill. The Amber Alert is a notification system that sends warnings about missing and abducted children.

At face value, this seemed like something that was completely positive, and when it comes to rescuing abducted children, the Amber Alert system has surely saved many lives. However, the piece of legislation that put this system into effect is a perfect example of how the government is able to pass unpopular laws, by attaching them to popular bills.

In the case of the legislation that set up the Amber Alert system, there were also completely unrelated issues covered in the bill. For example, hidden deep within the bill was one of Joe Biden’s pet projects, the RAVE Act, a law that imposes legal penalties on the hosts and participants of late-night dance parties.

According to the Wikipedia entry for the RAVE Act:

On Thursday (April 10, 2003) the Senate and House passed the Illicit Drug Anti-Proliferation Act (formerly known as the RAVE Act) as an attachment to the child abduction-related AMBER Alert Bill. The language of the original act was changed slightly before the bill was passed without public hearing, debate or a vote.

Festivals and other events are not to blame for overdoses, or other personal decisions that attendees make on their own. This is especially important to consider when the events host anywhere between 10,000 and 100,000 people.

In an area with that many people, as populous as some towns, it is inevitable that a wide variety of situations will pop up. In fact, any large event that hosts so many people sees occasional deaths. Due to the large volume of people, the chances increase that something will go wrong somewhere. This goes for everything from sporting events to parades and other types of events that are considered wholesome and family-friendly.

Some other factors to consider are the many unintended consequences of the drug war, which causes drugs to be more dangerous and limits the harm reduction policies that could be put in place to prevent overheating and drug overdoses.

John Vibes is an author, researcher and investigative journalist.

Doctors protest high prices of cancer drugs in US


By Brad Dixon
14 September 2015

In a commentary appearing in the journal Mayo Clinic Proceedings in July, a group of 118 oncologists called for measures to address the skyrocketing prices of cancer drugs, which have increased by five- to ten-fold over a period of 15 years. “It’s time for patients and their physicians to call for change,” said the commentary’s lead author, Ayalew Tefferi, a hematologist at the Mayo Clinic.

The second signatory of the commentary, Dr. Hagop Kantarjian, chairman of MD Anderson Cancer Center’s leukemia department, has been leading a campaign against the high costs of cancer drugs. In April 2013, he published an editorial in the medical journal Blood, signed by 100 other cancer experts, which argued that the prices of new chronic myelogenous leukemia (CML) drugs “are too high, unsustainable, may compromise access of needy patients to highly effective therapy, and are harmful to the sustainability of our national healthcare systems.”

The Blood editorial noted that 11 of the 12 drugs approved by the FDA for cancer indications in 2012 were priced above $100,000 per year. In 2014, the Mayo commentary observes, all of the newly approved cancer drugs were priced above $120,000.

The Mayo commentary cites an article recently published in the Journal of Economic Perspectives which examined 58 anticancer drugs approved between 1995 and 2013. The article’s authors found that the average launch price of anticancer drugs, adjusting for inflation and health benefits, rose by $8,500 each year—an annual increase of 10 percent.

The high drug prices, the authors write, mean that cancer patients “have to make difficult choices between spending their incomes (and liquidating assets) on potentially lifesaving therapies or foregoing treatment to provide for family necessities (food, housing, education).”

The Mayo commentary comes on the heels of the 2015 annual meeting of the American Society of Clinical Oncology (ASCO) held in Chicago where the high drug prices were criticized during the meeting’s plenary session by Dr. Leonard Saltz, chief of gastrointestinal oncology at Memorial Sloan Kettering Cancer Center. The meeting, attended by an estimated 25,000 doctors, is sponsored by the pharmaceutical industry, making the remarks unusual for the venue.

“Cancer-drug prices are not related to the value of the drug,” Saltz told the Wall Street Journal. “Prices are based on what has come before and what the seller believes the market will bear.”

Novartis’s cancer drug Gleevec, for example, which pulls in nearly $5 billion a year for the drug company, has more than tripled in price since it was first approved in 2001—jumping from a wholesale price of $2,624 a month to $9,210. Even with health insurance and a full-time job, patient co-pays for Gleevec can be as high as $2,000 a month.

As Forbes staff writer Matthew Herper commented, the price increase of Gleevec “happened partly because competition increased and, as new drugs entered the market at higher prices, Novartis raised its price too. The normal law of supply and demand worked in reverse.”

The reform measures proposed by the doctors, however—such as FDA determinations of fair prices, allowing Medicare to negotiate drug prices, and patent reform—have little chance of ever being implemented and, moreover, do not address the primary cause of the surge in drug prices: the subordination of healthcare, medicine, and drug discovery to the profit system.

This was made clear by the response of the pharmaceutical industry’s trade association PhRMA to the Mayo commentary, which took on the character of naked extortion: The pharmaceutical industry, which has higher profit margins than any other industrial sector, says that if patients refuse to pay exorbitant drug prices, then the drug industry will refuse to produce life-saving drugs.

“The policy proposals they recommend,” said Robert Zirkelbach, spokesman for PhRMA, “would, if adopted, send a chilling signal to the marketplace that risk-taking will no longer be rewarded, stopping innovation in its tracks and halting decades of progress in cancer care.”

Opposition to rising drug prices is growing. According to a Health Tracking Poll published last month by the Kaiser Family Foundation, 72 percent of Americans feel that drug costs are unreasonable, while 74 percent feel that drug companies put profits before people.

Why the Rich Love Burning Man

Burning Man became a festival that rich libertarians love because it never had a radical critique at its core.


In principle the annual Burning Man festival sounds a bit like a socialist utopia: bring thousands of people to an empty desert to create an alternative society. Ban money and advertisements and make it a gift economy. Encourage members to bring the necessary ingredients of this new world with them, according to their ability.

Introduce “radical inclusion,” “radical self-expression,” and “decommodification” as tenets, and designate the alternative society as a free space, where sex and gender boundaries are fluid and meant to be transgressed.

These ideas — the essence of Burning Man — are certainly appealing.

Yet capitalists also unironically love Burning Man, and to anyone who has followed the recent history of Burning Man, the idea that it is at all anticapitalist seems absurd: last year, a venture capitalist billionaire threw a $16,500-per-head party at the festival, his camp a hyper-exclusive affair replete with wristbands and models flown in to keep the guests company.

Burning Man is earning a reputation as a “networking event” among Silicon Valley techies, and tech magazines now send reporters to cover it. CEOs like Mark Zuckerberg of Facebook and Larry Page of Alphabet are foaming fans, along with conservative anti-tax icon Grover Norquist and many writers of the libertarian (and Koch-funded) Reason magazine. Tesla CEO Elon Musk even went so far as to claim that Burning Man “is Silicon Valley.”

Radical Self-Expression

The weeklong Burning Man festival takes place once a year over Labor Day weekend in a remote alkali flat in northwestern Nevada. Two hours north of Reno, the inhospitable Black Rock Desert seems a poor place to create a temporary sixty-thousand-person city — and yet that’s entirely the point. On the desert playa, an alien world is created and then dismantled within the span of a month. The festival culminates with the deliberate burning of a symbolic effigy, the titular “man,” a wooden sculpture around a hundred feet tall.

Burning Man grew from unpretentious origins: a group of artists and hippies came together to burn an effigy at Baker Beach in San Francisco, and in 1990 set out to have the same festival in a place where the cops wouldn’t hassle them about unlicensed pyrotechnics. The search led them to the Black Rock Desert.

Burning Man is very much a descendant of the counterculture San Francisco of yesteryear, and possesses the same sort of libertine, nudity-positive spirit. Some of the early organizers of the festival professed particular admiration for the Situationists, the group of French leftists whose manifestos and graffitied slogans like “Never Work” became icons of the May 1968 upsurge in France.

Though the Situationists were always a bit ideologically opaque, one of their core beliefs was that cities had become oppressive slabs of consumption and labor, and needed to be reimagined as places of play and revolt. Hence, much of their art involved cutting up and reassembling maps, and consuming intoxicants while wandering about in Paris.

You can feel traces of the Situationists when walking through Black Rock City, Burning Man’s ephemeral village. Though Black Rock City resembles a city in some sense, with a circular dirt street grid oriented around the “man” sculpture, in another sense it is completely surreal: people walk half-naked in furs and glitter, art cars shaped like ships or dragons pump house music as they purr down the street.

Like a real city, Burning Man has bars, restaurants, clubs, and theaters, but they are all brought by participants because everyone is required to “bring something”:

The people who attend Burning Man are no mere “attendees,” but rather active participants in every sense of the word: they create the city, the interaction, the art, the performance and ultimately the “experience.” Participation is at the very core of Burning Man.

Participation sounds egalitarian, but it leads to some interesting contradictions. The most elaborate camps and spectacles tend to be brought by the rich because they have the time, the money, or both, to do so. Wealthier attendees often pay laborers to build and plan their own massive (and often exclusive) camps. If you scan San Francisco’s Craigslist in the month of August, you’ll start to see ads for part-time service labor gigs to plump the metaphorical pillows of wealthy Burners.

The rich also hire sherpas to guide them around the festival and wait on them at the camp. Some burners derogatorily refer to these rich-person camps as “turnkey camps.”

Silicon Valley’s adoration of Burning Man goes back a long way, and tech workers have always been fans of the festival. But it hasn’t always been the province of billionaires — in the early days, it was a free festival with a cluster of pitched tents, weird art, and explosives; but as the years went on, more exclusive, turnkey camps appeared and increased in step with the ticket price — which went from $35 in 1994 to $390 in 2015 (about sixteen times the rate of inflation).
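A quick back-of-the-envelope check of that inflation comparison (a sketch only; the cumulative CPI factor of roughly 1.6 for 1994–2015 is an assumption supplied here, not a figure from the piece):

```python
# Compare the growth in Burning Man ticket prices (1994-2015) to inflation.
# Assumption: cumulative US CPI inflation over the period of ~60% (factor 1.6).
price_1994 = 35.0
price_2015 = 390.0
cpi_factor = 1.6  # assumed cumulative inflation factor, 1994 -> 2015

ticket_growth = (price_2015 - price_1994) / price_1994  # ~10.14, i.e. ~1014%
inflation_growth = cpi_factor - 1.0                     # 0.6, i.e. 60%

print(f"Ticket price growth: {ticket_growth:.0%}")        # 1014%
print(f"Inflation over period: {inflation_growth:.0%}")   # 60%
print(f"Ratio: {ticket_growth / inflation_growth:.1f}x")  # 16.9x, "about sixteen times"
```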

Black Rock City has had its own FAA-licensed airport since 2000, and it’s been getting much busier. These days you can even get from San Carlos in Silicon Valley to the festival for $1,500. In 2012, Mark Zuckerberg flew into Burning Man on a private helicopter, staying for just one day, to eat and serve artisanal grilled cheese sandwiches. From the New York Times:

“We used to have R.V.s and precooked meals,” said a man who attends Burning Man with a group of Silicon Valley entrepreneurs. (He asked not to be named so as not to jeopardize those relationships.) “Now, we have the craziest chefs in the world and people who build yurts for us that have beds and air-conditioning.” He added with a sense of amazement, “Yes, air-conditioning in the middle of the desert!”

The growing presence of the elite in Burning Man is not just noticed by outsiders — long-time attendees grumble that Burning Man has become “gentrified.” Commenting on the New York Times piece, burners express dismay at attendees who do no work. “Paying people to come and take care of you and build for you . . . and clean up after you . . . those people missed the point.”

Many Burners seethed after reading one woman’s first-person account of how she was exploited while working at the $17,000-per-head camp of venture capitalist Jim Tananbaum. In her account, she documented the many ways in which Tananbaum violated the principles of the festival, maintaining “VIP status” by making events and art cars private and flipping out on one of his hired artists.

Tananbaum’s workers were paid a flat $180 a day with no overtime, but the anonymous whistleblower attests that she and others worked fifteen- to twenty-hour days during the festival.

The emergent class divides among Burning Man attendees are borne out by data: the Burning Man census (yes, they have a census, just like a real nation-state) showed that from 2010 to 2014, the share of attendees who make more than $300,000 a year doubled from 1.4% to 2.7%. This number is especially significant given the outsize presence 1 percenters command at Burning Man.

In a just, democratic society, everyone has equal voice. At Burning Man everyone is invited to participate, but the people who have the most money decide what kind of society Burning Man will be — they commission artists of their choice and build to their own whims. They also determine how generous they are feeling, and whether to withhold money.

It might seem silly to quibble over the lack of democracy in the “governance” of Black Rock City. After all, why should we care whether Jeff Bezos has commissioned a giant metal unicorn or a giant metal pirate ship, or whether Tananbaum wants to spend $2 million on an air-conditioned camp? But the principles of these tech scions — that societies are created through charity, and that the true “world-builders” are the rich and privileged — don’t just play out in the Burning Man fantasy world. They carry over into the real world, often with less-than-positive results.

Remember when Facebook CEO Mark Zuckerberg decided to help “fix” Newark’s public schools? In 2010, Zuckerberg, perhaps hoping to improve his image after his callous depiction in the biopic The Social Network, donated $100 million to Newark’s education system to overhaul its schools.

The money was directed as a part of then–Newark Mayor Cory Booker’s plan to remake the city into the “charter school capital of the nation,” bypassing public oversight through partnership with private philanthropists.

Traditionally, public education has been interwoven with the democratic process: in a given school district, the community elects the school board every few years. School boards then make public decisions and deliberations. Zuckerberg’s donation, and the project it was attached to, directly undermined this democratic process by promoting an agenda to privatize public schools, destroy local unions, disempower teachers, and put the reins of public education into the hands of technocrats and profiteers.

This might seem like an unrelated tangent — after all, Burning Man is supposed to be a fun, liberating world all its own. But it isn’t. The top-down, do what you want, radically express yourself and fuck everyone else worldview is precisely why Burning Man is so appealing to the Silicon Valley technocratic scions.

To these young tech workers — mostly white, mostly men — who flock to the festival, Burning Man reinforces and fosters the idea that they can remake the world without anyone else’s input. It’s a rabid libertarian fantasy. It fluffs their egos and tells them that they have the power and right to make society for all of us, to determine how things should be.

This is the dark heart of Burning Man, the reason that high-powered capitalists — and especially capitalist libertarians — love Burning Man so much. It heralds their ideal world: one where vague notions of participation replace real democracy, and the only form of taxation is self-imposed charity. Recall Whole Foods CEO John Mackey’s op-ed, in the wake of the Obamacare announcement, in which he proposed a healthcare system reliant on “voluntary, tax-deductible donations.”

This is the dream of libertarians and the 1 percent, and it reifies itself at Burning Man — the lower caste of Burners who want to partake in the festival are dependent on the whims and fantasies of the wealthy to create Black Rock City.

Burning Man foreshadows a future social model that is particularly appealing to the wealthy: a libertarian oligarchy, where people of all classes and identities coexist, yet social welfare and the commons exist solely on a charitable basis.

Of course, the wealthy can afford more, both in lodging and in what they “bring” to the table: so at Burning Man, those with more money, who can bring more in terms of participation, labor and charity, are celebrated more.

It is a society that we find ourselves moving closer to during the other 358 (non–Burning Man) days of the year: with a decaying social welfare state, more and more public amenities exist only as the result of donations from the hyper-wealthy. But when the commons are donated by the wealthy, rather than guaranteed by membership in society, the democratic component of civic society is vastly diminished and placed in the hands of the elite few who gained their wealth by using their influence to cut taxes and gut the social welfare state in the first place.

It’s much like how in my former home of Pittsburgh, the library system is named for Andrew Carnegie, who donated a portion of the initial funds. But the donated money was not earned by Carnegie; it trickled up from his workers’ backs, many of them suffering from overwork and illness caused by his steel factories’ pollution. The real social cost of charitable giving is the forgotten labor that builds it and the destructive effects that flow from it.

At Burning Man the 1 percenters — who have earned their money in the same way that Carnegie did so long ago — show up with an army of service laborers, yet they take the credit for what they’ve “brought.”

Burning Man’s tagline and central principle is radical self-expression:

Radical self-expression arises from the unique gifts of the individual. No one other than the individual or a collaborating group can determine its content. It is offered as a gift to others. In this spirit, the giver should respect the rights and liberties of the recipient.

The root of Burning Man’s degeneration may lie in the concept itself. Indeed, the idea of radical self-expression is, at least under the constraints of capitalism, a right-wing, Randian ideal, and could easily be the core motto of any of the large social media companies in Silicon Valley, who profit from people investing unpaid labor into cultivating their digital representations.

It is in their interest that we are as self-interested as possible, since the more we obsess over our digital identity, the more personal information of ours they can mine and sell. Little wonder that the founders of these companies have found their home on the playa.

It doesn’t seem like Burning Man can ever be salvaged, or taken back from the rich power-brokers who’ve come to adore it and now populate its board of directors. It became a festival that rich libertarians love because it never had a radical critique at its core; and, without any semblance of democracy, it could easily be controlled by those with influence, power, and wealth.

Burning Man will be remembered more as the model for Google CEO Larry Page’s dream of a libertarian state than as the revolutionary Situationist space that it could have been.

As such, it is a cautionary tale for radicals and utopianists. When “freedom” and “inclusion” are disconnected from democracy, they often lead to elitism and reinforcement of the status quo.

Hiding in plain sight: the history of the War on Drugs


By Paul Bermanszohn

August 13, 2015

The War on Drugs was a direct response to the African American uprisings of the 1960s. Its racist and repressive effects continue to be felt today.

Photo: A scene of the 1967 Newark Rebellion, by Don Hogan Charles.

Recent US history, from the 1960s until today, shows the War on Drugs to be a crusade of repression against African American people, incarcerating millions to prevent a renewal of the struggle for freedom.

We need to look at the whole picture of this drug war, not just a fragment or a piece of it. Most writers on this subject either get lost in the details or cannot see past the lie that the US is a “democracy.” In either case they often fail to see the realities of this history, even though the facts are clear. Presenting well-known events in chronological order clarifies the inner connection among these events and brings out their larger significance.

Indeed, placing the history in sequence makes it plain: the Great Migration brought on a Great Rebellion. A vindictive Great Repression was orchestrated to crush the Great Rebellion and prevent its continuation. Masked as the so-called “War on Drugs,” which has swept millions into prisons and jails across the US, the Great Repression has, in effect, punished generations for the “sins” of their ancestors — those who dared to rebel.

This repression is still underway today. Its effects are clearly racial. But, camouflaged as a “War on Drugs,” it has allowed the country’s rulers to appear “colorblind” or race-neutral — as if they are merely enforcing the law.

The Great Migration

In the early 20th century, fleeing the decaying Jim Crow system of agricultural labor in the fields and farms of the South, millions of African Americans moved out, seeking jobs in the military-industrial centers of the North, the Midwest and the West. From World War I to the 1960s, millions migrated from virtual chattel slavery in the South to wage slavery in the North. They found little improvement.

Herded into old ghettos, or into quickly created new ones, they found discrimination and barely habitable housing, with a constant threat of dislocation by projects of urban renewal, or “Negro removal.” Giant housing projects, little more than stacks of shacks, were built to house the many migrants. Overcrowded and neglected schools provided poor or non-existent education for their children.

The misery was compounded by relentless police abuse. When Malcolm X spoke of “the so-called Negro out here catching hell,” he was talking about (and to) this group. Malcolm lived this experience and became the spokesman of urban ghetto dwellers. The desperation and outrage experienced by these migrants made explosion inevitable.

The Great Rebellion

Violent repression of civil rights demonstrators seeking basic respect combined with the migrants’ sufferings to ignite a series of mass urban uprisings across the US. These insurrections are generally seen as individual explosions, city by city, but to grasp their cumulative significance we need to see them as a single process: African Americans striving for freedom in racist America. The rebellion was at the heart of the ’60s and drives American politics to this day, even under the nation’s first black president.

These rebellions are generally dismissed as “riots” and their significance erased.

Kenneth Stahl titled his website and book on the Detroit Rebellion of 1967 The Great Rebellion, but I expand the use of this term to include all these uprisings. Virtually all were precipitated by violent police attacks or rumors of such attacks. Since officials often lie, it is impossible to know what exactly happened in every case, but at any rate a large number of uprisings took place across the country: over 300 cities rose up in the ‘60s, according to the best estimates.

The first insurrection, in New York City, was touched off by a police murder. The initial focus of the demonstration, called for by the Congress of Racial Equality (CORE), was the disappearance of three civil rights workers in Mississippi. However, when in the early morning of July 16, 1964, off-duty police Lieutenant Thomas Gilligan killed 15-year-old African American student James Powell, CORE decided to change the focus of their protest to police brutality in Harlem.

The protest was peaceful, but rage at the murder grew into a mass confrontation with police. Bands of looters operated in Harlem’s streets at night. Upheaval soon spread to Bedford Stuyvesant. After the New York City insurrection abated, like a series of aftershocks, smaller uprisings took place throughout the area, in upstate NY, NJ and Pennsylvania.

A year later, on August 11, 1965, unrest broke out in Watts, Los Angeles. Among the first targets of looters were gun stores — and they made full use of their weapons. For almost a week, people fought the police and army to a standstill. Black and white looters working together led King to state that “this was not a race riot. It was a class riot.” The Situationist International even treated the rebellion as a “revolutionary event,” with looting seen as a rejection of the commodity system, “the first step of a vast, all-embracing struggle.”

In 1966, there were 43 civil disturbances of varying intensity across the nation, including a notable uprising in Chicago, where the Puerto Rican community exploded into a week-long rebellion after a police shooting. On April 4, 1967, King delivered what is probably his most important speech: Beyond Vietnam: A Time to Break Silence. The relevance of this speech is often downplayed, and if mentioned at all, it tends to be portrayed as King’s speech opposing the US war in Vietnam. It was much more.

In the address, King embraced the world revolution saying, “if we are to get on to the right side of the world revolution, we as a nation must undergo a radical revolution of values. We must rapidly begin the shift from a thing-oriented society to a person-oriented society.” He called the US government “the greatest purveyor of violence in the world today” and called for an end to “the giant triplets of racism, materialism and economic exploitation.”

The speech galvanized the anti-war movement. Just eleven days later, on April 15, 1967, over 400,000 people marched to the UN to demand an end to the war. It was the first demonstration I ever attended. I vividly remember the excitement in the gathering place, Central Park’s Sheep Meadow, still packed with marchers, when word came that the front of the march, which filled the streets the whole way, had reached the UN over a mile away. The movement’s power continued to grow as the spirit of revolution spread.

In just a few years, the US military began to disintegrate. Eighty percent of soldiers were taking drugs. Combat refusals, naval mutinies and fragging incidents — soldiers shooting their officers — became widespread.

In 1967, over a hundred instances of violent upheaval were recorded. Most notable were the uprisings in Newark, where the violence was sparked by rumors of a black cab driver being killed by police after decades of housing discrimination and massive black unemployment, and the one in the Motor City, Detroit, where 43 people were killed after 12,000 soldiers descended upon the city in an attempt to quell the protests.

The Great Repression

The year ’68 proved to be the watershed. The Rebellion reached its peak and the initiative was seized by the forces of order, who subsequently organized the Great Repression. On April 4, 1968, Martin Luther King was killed, probably by government assassination. His murder, one year to the day after his revolutionary speech, strikes some as a signal sent by the government to deter people from taking the revolutionary path. If this is so, it did not work. Following King’s murder, the largest insurrection yet occurred: over 100 cities exploded.

The Holy Week Uprising was the most serious bout of social upheaval in the United States since the Civil War. The largest insurrections took place in Washington, D.C., Baltimore, Louisville, Kansas City, and Chicago — with Baltimore experiencing the most significant political events. The liberal Republican governor of Maryland, Spiro T. Agnew, gathered African American community leaders and subjected them to a dressing down for not supporting the US government strongly enough. Seeking to divide and conquer, he said: “I call upon you to publicly repudiate, condemn and reject all black racists. This, so far, you have not been willing to do.”

Agnew’s speech received national headlines and led to his role in the presidential election later that year, which centered on the urban uprisings of the preceding decade and created the miserable legacy of today. US politicians refined a coded language to conceal their racial motives. The Republican candidate Richard Nixon ran against the liberal Democrat Hubert Humphrey. The civil rights movement had driven not only the KKK underground, but overtly racist language as well; neither, however, came to an end.

The election centered on Nixon’s call for “law and order,” a slogan that meant a tough response to insurgents (called “rioters”) and the still popular notion that politicians should be “tough on crime.” Crime, disorder and violence became synonyms for being black.

Nixon eagerly started work on a war on drugs before his inauguration. Early in his presidency, he outlined his basic strategy to his chief of staff, H.R. Haldeman, who recorded it in his diary: “[President Nixon] emphasized that you have to face the fact that the whole problem is really the blacks. The key is to devise a system that recognizes this while not appearing to.”

Nixon’s diabolical efforts to develop a War on Drugs along these lines involved the highest officials in the US government, including William Rehnquist, later appointed Chief Justice of the Supreme Court by Reagan. Nixon initiated a war on crime as well as the War on Drugs, setting the pattern for future presidents.

Following in his predecessors’ footsteps, Reagan outdid Nixon in his get-tough-on-crime policies and oversaw the steepest rise in incarceration rates. Bill Clinton signed into law an omnibus crime bill in 1994, increasing the number of capital offenses and establishing the federal “three strikes” provision, which mandates life sentences for criminals convicted of a violent felony after two or more prior convictions, including for drug crimes. He poured over $30 billion into militarizing the nation’s police. His group, the Democratic Leadership Council, brought much of the Democratic Party to embrace coded racial politics in order to win over white voters.

For a new beginning

As a movement to stop violent police repression grows across the nation, some of our current rulers seem to understand that they have a tiger by the tail. The Clinton team has begun to suggest that mass incarceration might end. Hillary Clinton herself, as part of her presidential campaign, called “for a re-evaluation of prison sentences and trust between police and communities.”

The Black Lives Matter movement recognizes that discontent fueled by mass incarceration contributes to the movement to stop police murders. Less well-recognized is that granting the police immunity is itself part of the generalized repression of African Americans. The system of mass incarceration rests on a high degree of police discretion in choosing whom to suspect, interrogate and arrest, and in how to do these things. Restricting the police can hardly be allowed if the police are to continue the overall project of racial repression.

Part of developing a new revolutionary movement is to reclaim our history. The masters keep us enslaved by blinding us to our collective strength. The story of the ‘60s uprisings is one rich in power and agency; this is the reason why the rulers want to erase this period from the collective memory altogether.

At the same time, we must also recognize that the uprisings of the ’60s failed. Despite the vast strength revealed in the Great Rebellion, our enemies were able to use the images of violence and looting to further the divisions in US society and to institute their vengeful repression with at least the passive consent of the “white” majority. Time and again, the mainstream media proved a powerful tool in promoting the image of black and brown people as violent, criminal and dangerous.

It must be acknowledged that widespread looting and violence frightened the “white” majority, making it easier for the rulers to split the people and institute the Great Repression. King’s revolutionary non-violence had a much different effect on the American people. This must be pondered by serious revolutionaries.

Conditions for a new revolutionary movement are gradually maturing. There are growing rebellions seeking a new way of life throughout the world. In the US, an ever-spreading movement affirms the value of black lives as increasing numbers of European-American youth take up the struggle of African Americans as their own. Such a movement may, in time, bring an end to the socially constructed notion of whiteness, eliminating a key pillar of the rulers’ domination.

In the Virginia colony in the 17th century, the masters were horrified to see African and European laborers combine to seek to destroy the system of enslavement. Their response was to create a sharp division in condition between their African and their European slaves. They “invented” the white race to split the laborers and preserve their power — a remarkably effective and durable approach.

Race is a social construct devised and manipulated by our masters to maintain their rule. Only by eliminating class society, which continues to depend on racism, can racism as such be swept away.

Paul Bermanszohn, son of Holocaust survivors, is a retired psychiatrist and lifelong political revolutionary. He was shot in the head in an assassination attempt in the 1979 Greensboro Massacre, in which five of his close comrades were killed. His web site is Survival and Transformation.

This article is an edited version of a talk presented at a meeting of the End the New Jim Crow Action Network, on 14 July 2015 (Bastille Day), Kingston, NY.

Amy, a documentary film about the British singer Amy Winehouse

By Joanne Laurier
12 August 2015

Directed by Asif Kapadia

British-born director Asif Kapadia’s documentary, Amy, about the pop singer Amy Winehouse (1983-2011), is a straightforward and compelling account of the performer’s life starting at the age of fourteen. Through video footage from a variety of devices and the voiceover comments of friends, family members and music industry figures (Kapadia conducted 100 interviews), the documentary paints a picture of an immensely talented and tortured musical prodigy.

During her eight-year recording career, beginning when she was still a teenager, Winehouse garnered numerous awards, including six Grammys. Her second album, Back to Black, released in October 2006, made her an international singing star. By the time of her death, she had sold more than six million albums in the UK and US alone. Kapadia’s film features a number of her biggest hits, “Rehab” (2006), “You Know I’m No Good” (2007), “Back to Black” (2007), “Love is a Losing Game” (2007), and her soulful duet with Tony Bennett, “Body and Soul” (2011).


From an early age, as the documentary reveals, Winehouse aspired above all to be a jazz singer. Among her most important influences were Dinah Washington, Sarah Vaughan, Ella Fitzgerald, Billie Holiday, Frank Sinatra, Bennett and others. But she also channeled many of the pop artists and trends of the 1960s and 1970s, including Motown, R&B, reggae, Carole King, James Taylor and “girl groups” like the Shirelles and the Ronettes. In her music and extraordinary voice one encounters a multitude of influences, each one distinct and yet blended together to create a personal and unique sound. A record industry figure notes that she was a “very old soul in a very young body.”

After Winehouse’s death, Bennett commented: “It was such a sad thing because … she was the only singer that really sang what I call the ‘right way,’ because she was a great jazz-pop singer.…A true jazz singer.”

The movie opens with footage of a close friend’s 14th birthday party in 1998, at which Winehouse offers an alluring, mischievous version of “Happy Birthday” à la Marilyn Monroe, and ends with the aftermath of her tragic death from alcohol poisoning in July 2011 at the age of 27.

A friend observes at one point that she was “a North London Jewish girl with a lot of attitude.” Her father Mitchell owned a cab and her mother Janis was a pharmacist. Her paternal grandmother Cynthia was a singer and at one time dated Ronnie Scott, the tenor saxophonist and owner of the best-known jazz club in London.

Kapadia’s Amy follows Winehouse from her teenage years to the beginnings of her professional music career in 2002 and beyond. We see a host of appearances and performances, both private and public, some of them intensely intimate and very affecting to the viewer. In some of these scenes, the young singer is disarmingly genuine, childlike and really adorable.

Three of Winehouse’s friends, including two from childhood, Juliette Ashby and Lauren Gilbert, and her first manager (when he was 19 and Winehouse was 16), Nick Shymansky, provide the most in-depth and believable portrait. Her father, her ex-husband Blake Fielder and her promoter-manager Raye Cosbert also feature prominently in the film, in a less favorable light.

At a certain point, of course, Amy gets down to business, which every viewer knows is coming—the singer’s [meteoric] Rise and [tragic] Fall, as it were. Much of the documentary details her successes and the severe complications or contradictions accompanying those successes.


Winehouse insists—and one feels, sincerely—on several occasions that celebrity and what comes with it is not her goal. As she told CNN in a 2007 interview, “I don’t write songs because I want my voice to be heard or I want to be famous or any of that stuff. I write songs about things I have problems with and I have to get past them and I have to make something good out of something bad.” Early on in the film, in fact, she asserts, “I’d probably go mad [if I were famous].” Later on: “If I really thought I was famous, I’d f—g go top [kill] myself.”

Tragically, Winehouse, bulimic since her adolescence and a heavy drinker from early on in life, falls into heavy drug use. Kapadia’s documentary focuses perhaps too much on this aspect, as though this by itself could explain her fate.

The film effectively captures some of the ghastliness of the modern celebrity racket. Countless scenes record paparazzi camped outside her door and snapping photos of her every move, including the most crazed and desperate. Her ex-manager Shymansky told an interviewer, “the paparazzi were allowed to get brutally close…it’s this infatuation with getting up people’s skirts, or seeing someone vomit, or punching a paparazzi.”

As Winehouse goes to pieces in public, the media engages in what Shymansky calls, in the film, “a feeding frenzy.” Kapadia himself told the media, “This is a girl who had a mental illness, yet every comedian, every TV host, they all did it [bullied or laughed at her] with such ease, without even thinking. We all got carried away with it.”

The production notes for Amy suggest: “The combination of her raw honesty and supreme talent resulted in some of the most original and adored songs of the modern era.

“Her huge success, however, resulted in relentless and invasive media attention which coupled with Amy’s troubled relationships and precarious lifestyle saw her life tragically begin to unravel.”

The inquest into Winehouse’s death, according to the Daily Mail, found that she “drank herself to death … Three empty vodka bottles were found near her body in her bedroom. A pathologist who examined her said she had 416mg [milligrams] of alcohol per decilitre [3.38 fluid ounces] of blood—five times the legal drink-drive limit of 80mg. The inquest heard that 350mg was usually considered a fatal amount.”

Kapadia’s documentary is both valuable and intriguing. Because the director lets Winehouse speak (and sing) for herself, the viewer receives a relatively clear-eyed and balanced picture of both her artistry and her qualities as a human being. Amy rightfully points a finger at a predatory industry. Kapadia told NME (New Musical Express, the British music journalism magazine and website), “I was angry, and I wanted the audience to be angry. … This started off as a film about Amy, but it became a film about how our generation lives.” NME continues, “Kapadia hopes his film will force the music industry to re-examine its handling of young, troubled talents.”

In the interview, unfortunately, the director places too much of the blame on the public itself, as though people were in control of the information they received and were responsible for the operations of the entertainment industry. Amy, at more than two hours, is perhaps overly long because the filmmaker seems intent on driving home to the viewer his or her supposed “complicity.”

In opposition to this, the 2011 WSWS obituary of Winehouse argued that the ultimate responsibility for her death lay with “the intense … pressures generated by the publicity-mad, profit-hungry music business, which chews up its human material almost as consistently as it spits out new ‘product.’”

Comic Russell Brand, in a comment on the death of Winehouse, a close friend, characterized the celebrity culture as “a vampiric, cannibalizing system that wants its heroes and heroines dead so it can devour their corpses in public for entertainment.”

Stepping back, Amy Winehouse was definitely a cultural phenomenon. As opposed to many acts and performers, who ride on the crest of massive marketing campaigns, like bars of soap or automobiles, she came by her fame honestly, almost in spite of her efforts. She truly struck a chord with audiences and listeners.

This was not an accident. Her songs, in part because they brought to bear (and made new) so much popular musical history, registered with audiences as more substantial, truthful and urgent than the majority of current fare. Winehouse’s popularity reflected a dissatisfaction with the lazy, self-absorbed pablum that dominates the charts.

In terms of the combination of factors that led to Winehouse’s death, of course there were the individual circumstances of her background and life. The entertainment industry juggernaut inflicts itself on everyone, but only the most vulnerable collapse under what is to them an unbearable burden.

As always in such cases, the media self-servingly treated the singer’s death as a purely personal episode. The Daily Mirror, for example, fatuously suggested that Winehouse was “a talent dogged by self destruction.”

Surely, something more than this, or what Amy offers as an explanation (drugs, difficult family history, a bad marriage, a cold-blooded industry), for that matter, is called for. Why would someone at the height of her global fame and popularity bring about the end of her life so abruptly and “needlessly”? What made her so wretched and conflicted?

To begin to get at an answer, one must look at the more general circumstances of her life, including the character of the period in which she lived…and died.

She came of age and later gained public attention between the years 2001 and 2011, in other words, a decade dominated by “the war on terror” and the politicians’ “big lie” and hostility to democratic rights (the Blair government in particular prided itself on flouting the public will), as well as by global economic turbulence and sharp social polarization. The generation she belonged to increasingly looked to the future with skepticism and even alarm. A serious darkness descended into more than one soul.

One study of American college-age students in the first decade of the 21st century, and this could certainly be applied to British young people as well, sums them up as a “generation on a tightrope,” facing an “abyss that threatens to dissolve and swallow them,” “seeking security but [living] in an age of profound and unceasing change.”

Amy Winehouse, as far as this reviewer is aware (or the film would indicate), never addressed a single broader social issue, including the Blair government’s involvement in the Iraq War, which provoked one of the largest protest demonstrations in British history in February 2003. Nonetheless, the drama, anxiety and sensitivity of her music speak to something about both the turbulent character of the decade and the dilemma of a generation whose dreams and idealism came up against the brutal realities of the existing social set-up.

The manner in which Winehouse approached the latter “dilemma” was refracted through the objectively shaped confusion and difficulties of that same generation, which finds itself hostile toward dominant institutions and yet not entirely clear why. In that light, Winehouse’s famous refrain in “Rehab,” “They tried to make me go to rehab, but I said, ‘No, no, no,’” which from one point of view might simply seem self-indulgent, comes across as a firm (if misguided in this case) rejection of intrusive orders from above.

In part due to an unfavorable and unsympathetic social atmosphere, a conscious or unconscious alienation from official public life, Winehouse turned inward and reduced these significant feelings into purely personal passions and self-directed anger, and ultimately, with her lowered psychic immune system, found herself a victim of that rage and disorientation.

Kapadia’s Amy does not go anywhere near some of these critical issues, but it is a worthwhile introduction to the work of a remarkably gifted artist.

Fifty years on: Medicare under assault


31 July 2015

President Lyndon B. Johnson signed Medicare, the national health insurance program for Americans 65 years of age and older, into law on July 30, 1965. Medicare and the accompanying Medicaid health program for the poor were the last major social reforms enacted in the US and came at a time of intense crisis for American capitalism.

The mid-1960s saw a nation gripped by the civil rights movement and militant struggles by workers for higher wages and improved social conditions. Two weeks before Johnson signed the Medicare bill, a riot broke out in Harlem, New York following the shooting of a black teenager, one of the earliest of the numerous urban rebellions that would erupt over the next three years.

In the US pursuit of global domination, on March 8, 1965, 3,500 US Marines were dispatched to South Vietnam, marking the beginning of the US ground war in Southeast Asia. Only two days before signing Medicare into law, Johnson announced the doubling of draft quotas and the dispatch of another 50,000 troops to Vietnam. The war would end in a humiliating defeat for US imperialism a decade later, after the deaths of more than 58,000 Americans and millions of Vietnamese.

As with the Social Security Act under Franklin D. Roosevelt in 1935 and the establishment of industrial unions, Medicare was not granted out of the kindness of the hearts of the ruling class. It came as a concession to mass struggles carried out by the working class.

However, by today’s standards, passages from the Democratic Party platform on which Johnson ran in 1964 sound radical. In a section titled “The Individual,” the platform reads: “The health of the people is important to the strength and purpose of our country and is a proper part of our common concern. In a nation that lacks neither compassion nor resources, the needless suffering of people who cannot afford adequate medical care is intolerable.”

From the start, Medicare fell far short of providing free and comprehensive medical care for all seniors. As originally enacted, the program provided for inpatient hospital care (Part A) as well as certain outpatient services (Part B), including preventive services, ambulance transport, mental health and other medical services. Part B has always required a premium payment.

In 1972, President Richard Nixon signed legislation expanding coverage for those under age 65 with long-term disabilities and end-stage renal disease. Since 1997, enrollees have had the option to enroll in Medicare Advantage (Part C), managed care programs administered by private companies. It was not until 2003 that optional prescription drug benefits (Part D), exclusively provided through private plans, were added under George W. Bush.

It is important to note that all components of Medicare carry premiums and deductibles, with the partial exception of Part A, for which most enrollees pay no premium. Despite these shortcomings, Medicare represented an important, albeit limited, advance in health care for seniors, one denounced as “socialism” in many ruling class circles.

The Medicare legislation faced significant opposition in both big business parties. The Democratic vote in favor of the bill was 57-7 in the Senate and 237-48 in the House. The Republicans opposed the bill 13-17 in the Senate and narrowly approved it in the House, 70-68.

Hostility to the legislation among leading Republicans was vociferous. Senator Barry Goldwater commented in 1964: “Having given our pensioners their medical care in kind, why not food baskets, why not public housing accommodations, why not vacation resorts, why not a ration of cigarettes for those who smoke and of beer for those who drink?”

In 1964, future president George H.W. Bush denounced the impending Medicare bill as “socialized medicine.” While it was nothing of the sort, it was seen by many supporters as a first step toward the establishment of universal health care.

Despite its limitations, the program has indisputably had an immense impact on the health and social well-being of the elderly population.

Largely as a result of Medicare and improved medical technologies, life expectancy at age 65 increased from 14.3 years in 1960 to 19.3 years in 2012. Prior to Medicare, about half of America’s seniors did not have hospital insurance, more than one in four elderly went without medical care due to cost, and one in three seniors lived in poverty.

Some 53 million people are currently enrolled in Medicare. Today, virtually all seniors have access to health care, and only about 14 percent live below the poverty line. Despite the relentless attack on its benefits in recent years, Medicare remains extremely popular, with 77 percent of Americans viewing it as a “very important” program that needs to be defended, according to a recent poll.

The program has been under assault from sections of the political establishment and corporate America since its inception. In 1995, under the leadership of then-House Speaker Newt Gingrich, Republicans proposed cutting 14 percent from projected Medicare spending and forcing millions of elderly recipients into managed health programs. The aim, in Gingrich’s words, was to ensure that Medicare was “going to wither on the vine.”

In the most open threat yet to privatize Medicare, Rep. Paul Ryan, Republican of Wisconsin, released a “Path to Prosperity” budget plan in the spring of 2014 that proposed slashing $5.1 trillion in federal spending over 10 years. Key to his blueprint was the institution of “premium support” in health care for seniors, essentially a voucher plan under which seniors would purchase either private insurance or Medicare coverage.

Fast-forward to the current presidential campaign. Republican candidate Jeb Bush, speaking at an event last week in New Hampshire sponsored by the billionaire Koch brothers, said of Medicare: “We need to figure out a way to phase out this program … and move to a new system that allows them [those over 65] to have something—because they’re not going to have anything.”

Bush and others justify their proposals to privatize or outright abolish Medicare with claims that the program will go bankrupt in the near future. But a recent report projects that Medicare spending will account for 6 percent of Gross Domestic Product by 2090, down from earlier projections of 13 percent of GDP by 2080.

This is hardly an unreasonable amount to spend on the health of the nation’s elderly population. Nor is this spending a gift from the government: it is funded through payroll deductions taken from workers throughout their working lives. The policy decisions of politicians in Washington, however, are driven not by concern for the health and welfare of America’s older citizens, but by the defense of the capitalist profit system.

While President Obama and the Democrats seek to distance themselves from proposals to privatize Medicare, Ryan and Bush only openly express what many Democrats are thinking. The Obama administration, with the Affordable Care Act (ACA) leading the charge, is working to gut Medicare and transform it into a poverty program with barebones coverage for the majority of working class and middle class seniors.

In 2012, the Congressional Budget Office estimated that the ACA would reduce Medicare spending by $716 billion between 2013 and 2022. Over the first four years of the law’s implementation, home health care under Medicare is being cut by 14 percent, including $60 million in 2015 and $350 million in 2016. While doing nothing to rein in the outrageous prices charged by pharmaceutical companies for cancer treatments and other life-saving drugs, the Obama administration’s proposed 2016 budget includes $126 billion in cuts to what Medicare will pay for these drugs.

In what constitutes a historic attack on the program, Obama hailed as a “bipartisan achievement” the passage in April of a bill that expands means testing for Medicare and establishes a new payment system under which doctors will be rewarded for cutting costs and penalized for the volume and frequency of the health care services they provide.

It is telling that an article in the right-wing National Review, headlined “A Medicare Bill Conservatives Need to Embrace,” hailed the legislation and said the effects of its structural reforms would be “permanent and cumulative.”

The bipartisan backing for the Medicare bill rests on a common agreement that Medicare spending must be slashed and a radical shift instituted away from the “lavish” fee-for-service system, under which supposedly “unnecessary” tests and procedures are performed on Medicare patients, needlessly treating their illnesses and extending their lives.

The president has claimed that the enactment of the program commonly known as Obamacare is the most sweeping social reform since Medicare was signed into law. This is a cynical lie. The ACA is, in fact, a social counter-reform that was aimed from the start at cutting costs for the government and corporations and reducing and rationing health care for the majority of Americans.

The ACA is designed to encourage employers to slash or end their employee insurance plans, forcing workers to individually purchase plans from private companies on government-run exchanges. The result will be the dismantling of the employer-provided health insurance system that has existed since the early 1950s, a vast increase in workers’ out-of-pocket costs, and a decrease in the care they receive.

Medicare, like Social Security one of the last vestiges of social reform from a previous era, is being undermined. The social right to health care, along with the rights to a livable income, education, housing, and a secure retirement, is incompatible with a society subordinated to capitalist forces.

True reform of the health care system requires that it be reorganized based on a socialist program that proceeds from the fulfillment of human needs, not the enrichment of a parasitic elite.
