When Poverty Is Profitable

A new book details how foster-care agencies and other safety-net programs hire consultants to maximize their funding and divert it from its intended use.

Carlo Allegri / Reuters
America’s safety-net programs are meant to help the poorest and most vulnerable meet their basic needs—food, medical care, and safe housing—and there’s an ongoing debate about just how robust and successful these programs are. In his new book, The Poverty Industry: The Exploitation of America’s Most Vulnerable Citizens, Daniel L. Hatcher suggests that the problems plaguing programs such as foster care and Medicaid are deeper and more troubling than most realize. Hatcher, a professor at the University of Baltimore’s School of Law, writes:

States and their human service agencies are partnering with private companies to form a vast poverty industry, turning America’s most vulnerable populations into a source of revenue … The resulting industry is strip-mining billions in federal aid and other funds from impoverished families, abused and neglected children, and the disabled and elderly poor.

How does this happen? Hatcher uses the example of the foster-care system, where some states enlist private consultants to come up with strategies to maximize disability claims for children in their care. That results in higher payouts from the federal government. But instead of being spent on the children’s care, the money is diverted to other purposes the state deems necessary.

I spoke with Hatcher about his book, how states and private groups help abuse the safety net, and how to prevent such abuses. The interview below has been lightly edited for clarity.

Gillian B. White: One of the central themes you discuss in the book is this premise of an “iron triangle” related to poverty. The term “iron triangle” is a general way of describing the intersecting relationship between government agencies, special-interest groups, and legislators. Can you walk me through how this concept applies to the mechanisms that are supposed to protect the poorest Americans?

Daniel L. Hatcher: With poverty’s iron triangle, you have the relationship between the federal government, the state government, and the poverty-industry private contractors. Funds that are intended to help vulnerable populations are misused: the states are routing them into general coffers or sometimes to other uses. And a significant amount of that federal aid is used to pay the private contractors.

White: You cite some pretty disturbing, egregious first-hand examples of poor people being gamed by the very services meant to protect them. How did you happen upon this information?

Hatcher: My first job at Legal Aid was representing children in the Baltimore City foster-care system. Over a year, I probably represented 300 children or more. That experience has stayed with me: seeing what these children are going through while they’re in foster care, what brought them into foster care, and how they’re struggling when they leave foster care. I encountered this practice where the agencies that exist to serve foster children are taking their resources. It really spurred me into action.

White: How does this exploitation work, say in the case of foster services?

Hatcher: You have foster agencies across the country—these agencies that exist to serve abused, neglected children—partnering with revenue-maximization consultants. One of the strategies they use is to go after children’s survivor and disability benefits. They hire contractors to try to increase the number of children in their care determined to be disabled, or target those children who have deceased parents. It’s not to provide additional resources to those children or to improve services for the children’s conditions, but to take the money as a state-revenue source. Then contractors will take sometimes a contingency fee, sometimes a flat fee, depending on the arrangements.

I represented a former foster child in Maryland. He had been in foster care since he was 12. His father died while he was in foster care. The father worked, paid into the system, and so his child earned a survivor benefit, which is much like life insurance. The state applied for that money on his behalf and applied to become a representative payee to take control of the money. And then took the money. The foster-care agency took his money without even telling him he had this benefit that his father had left him. So that harm is not only taking a cash benefit from a maltreated child; it’s taking away a connection to a deceased parent that can have a vast emotional benefit.

White: And the foster-care system is just one example, right? Where else do these abuses exist?

Hatcher: What’s most rampant are Medicaid maximization and diversion strategies. For these plans, states target nursing homes or hospitals that serve a disproportionate share of the poor, or schools with disabled children. They claim additional funds and then route the federal aid to general state coffers.

There are some states that are using the money as intended, but I look at the states that are, in my view, misusing federal aid. New Jersey, for example: You have the Christie administration forcing school districts—by statute—to participate in a statewide project, working with a private contractor to maximize the number of schoolchildren receiving school-based Medicaid. The Medicaid funds are intended to help provide special-education and related services for those children. But when the schools participate, they often have to pay the state’s share of costs to start with, and that administration’s budget is taking over 80 percent of the funds from the disabled schoolchildren to bolster general state coffers.

White: How would such flagrant abuses manage to go unchecked?

Hatcher: Most people have no idea this is happening or they can’t understand it when they read a report about this budget shell game.

Part of the problem is that a lot of the economic theory behind fiscal federalism has ignored the relationship between government and private contractors, and ignored the fact that state governments are cash-strapped, so they’re looking for money wherever they can get it. They’re going after these sources of federal aid not to provide the aid services as intended, but as a general revenue mechanism.

White: What’s the solution? Is there one?

Hatcher: The regulations that are implemented in terms of how states use money need more teeth. I don’t think that the federal government has been strong enough in making it absolutely clear that federal Medicaid funds are supposed to be used for their intended purposes. That could be done either through regulation or through legislation. Often the agencies already have the capacity under the current regulatory structure to do this. One concern is private contractors. You have states hiring private contractors to come up with all these illusory strategies to maximize federal-aid funds, and then the states will sometimes divert those funds to other use. Sometimes those very same private contractors are then hired by the federal government to try to reduce payout of federal-aid funds.

I think that the book provides strong evidence that block grants are a horrible idea. If you have a governor, under the current structure of Medicaid that includes complex, strict, regulatory requirements—in terms of how the money is supposed to be claimed and used—who’s still finding ways to use budget shell games to maximize and divert funds, what would that governor do if you just hand him a blank check? We need to have simplified systems, to be sure, and greater monitoring to ensure that money is being used as intended. But a block grant would lead to greater misuse of aid funds intended for vulnerable populations.

White: Money is at the heart of all of this and you talk a lot about how states are cash strapped. How can that obstacle be overcome?

Hatcher: Fairer tax mechanisms to raise the revenue that states need are necessary in order to fund government. You have foster agencies that are so underfunded that they’ve reached the point of even looking to take money from beneficiaries. That should be a huge red flag that something’s wrong with the way our government is financed.

I don’t think that the evidence and the research findings in the book support any conclusions to cut government aid. We’re already incredibly underfunded in terms of the services that are provided to low-income individuals and children. If you have a governor who’s misusing federal aid, the answer isn’t to cut federal aid—it’s to stop the governor from misusing it.

White: Are you hopeful that these changes will actually happen?

Hatcher: Awareness is key here. It’s very simple: a realignment of purpose. States and agencies that exist to serve vulnerable populations need to stay true to their reason for existing.

When we become aware that foster-care agencies are taking funds for abused and neglected children, hopefully our outrage will increase to a level where we stop the practices. When we become aware that a state like New York is receiving an F grade in nursing-home care, but is using nursing homes to leverage additional federal funds and then taking those monies away from the nursing homes, I hope that outrage will increase to the point where we curtail those practices.



Happy All the Time

As biometric tracking takes over the modern workplace, the old game of labor surveillance is finding new forms.

By Lynn Stuart Parramore

Call them soldiers, call them monks, call them machines: so they were but happy ones, I should not care.
– Jeremy Bentham, 1787

Housed in a triumph of architectural transparency in Cambridge, Massachusetts, is the Media Lab complex at MIT, a global hub of human-machine research. From the outside of its newest construction, you can see clear through the building. Inside are open workspaces, glittering glass walls, and screens, all encouragement for researchers to peek in on one another. Everybody always gets to observe everybody else.

Here, computational social scientist Alex Pentland, known in the tech world as the godfather of wearables, directs a team that has created technology applied in Google Glass, smart watches, and other electronic or computerized devices you can wear or strap to your person. In Pentland’s quest to reshape society by tracking human behavior with software algorithms, he has discovered you don’t need to look through a glass window to find out what a person is up to. A wearable device can trace subliminal signals in a person’s tone of voice, body language, and interactions. From a distance, you can monitor not only movements and habits; you can begin to surmise thoughts and motivations.

In the mid-2000s Pentland invented the sociometric badge, which looks like an ID card and tracks and analyzes the wearer’s interactions, behavior patterns, and productivity. It became immediately clear that the technology would appeal to those interested in a more hierarchical kind of oversight than that enjoyed by the gurus of MIT’s high-tech playgrounds. In 2010 Pentland cofounded Humanyze, a company that offers employers the chance to find out how employee behavior affects their business. It works like this: A badge hanging from your neck embedded with microphones, accelerometers, infrared sensors, and a Bluetooth connection collects data every sixteen milliseconds, tracking such matters as how far you lean back in your chair, how often you participate in meetings, and what kind of conversationalist you are. Each day, four gigabytes’ worth of information about your office behavior is compiled and analyzed by Humanyze. This data, which then is delivered to your supervisor, reveals patterns that supposedly correlate with employee productivity.

IMAGE: Discovery of Achilles on Skyros, by Nicolas Poussin, c. 1649. © Museum of Fine Arts, Boston / Juliana Cheney Edwards Collection / Bridgeman Images.

Humanyze CEO Ben Waber, a former student of Pentland’s, has claimed to take his cues from the world of sports, where “smart clothes” are used to measure the mechanics of a pitcher’s throw or the launch of a skater’s leap. He is determined to usher in a new era of “Moneyball for business,” a nod to baseball executive Billy Beane, whose data-driven approach gave his team, the Oakland Athletics, a competitive edge. With fine-grained biological data points, Waber promises to show how top office performers behave—what happy, productive workers do.

Bank of America hired Humanyze to use sociometric badges to study activity at the bank’s call centers, which employ more than ten thousand souls in the United States alone. By scrutinizing how workers communicated with one another during breaks, analysts came to the conclusion that allowing people to break together, rather than in shifts, reduced stress. This was indicated by voice patterns picked up by the badge, processed by the technology, and reported on an analyst’s screen. Employees grew happier. Turnover decreased.

The executives at Humanyze emphasize that minute behavior monitoring keeps people content. So far, the company has focused on loaning the badges to clients for limited study periods, but as Humanyze scales up, corporate customers may soon be able to use their own in-house analysts and deploy the badges around the clock.

Workers of the world can be happy all the time.

The optimists’ claim: technologies that monitor every possible dimension of biological activity can create faster, safer, and more efficient workplaces, full of employees whose behavior can be altered in accordance with company goals.

Widespread implementation is already underway. Tesco employees stock shelves with greater speed when they wear armbands that register their rate of activity. Military squad leaders are able to drill soldiers toward peak performance with the use of skin patches that measure vital signs. On Wall Street, experiments are ongoing to monitor the hormones of stock traders, the better to encourage profitable trades. According to cloud-computing company Rackspace, which conducted a survey in 2013 of four thousand people in the United States and United Kingdom, 6 percent of businesses provide wearable devices for workers. A third of the respondents expressed readiness to wear such devices, which are most commonly wrist- or head-mounted, if requested to do so.

The life of spies is to know, not be known.
– George Herbert, 1621

Biological scrutiny is destined to expand far beyond on-the-job performance. Workers of the future may look forward to pre-employment genetic testing, allowing a business to sort potential employees based on disposition toward anything from post-traumatic stress disorder to altitude sickness. Wellness programs will give employers reams of information on exercise habits, tobacco use, cholesterol levels, blood pressure, and body mass index. Even the monitoring of brain signals may become an office commonplace: at IBM, researchers bankrolled by the military are working on functional magnetic-resonance imaging, or fMRI, a technology that can render certain brain activities into composite images, turning thoughts into fuzzy external pictures. Such technology is already being used in business to divine customer preferences and detect lies. In 2006 a San Diego start-up called No Lie MRI expressed plans to begin marketing the brain-scanning technology to employers, highlighting its usefulness for employee screening. And in Japan, researchers at ATR Computational Neuroscience Laboratories have a dream-reading device in the pipeline that they claim can predict what a person visualizes during sleep. Ryan Hurd, who serves on the board of the International Association for the Study of Dreams, says such conditioning could be used to enhance performance. While unconscious, athletes could continue to practice; creative types could boost their imaginations.

The masterminds at Humanyze have grasped a fundamental truth about surveillance: a person watched is a person transformed. The man who invented the panopticon—a circular building with a central inspection tower that has a view of everything around it—gleaned this, too. But contrary to most discussions of the “all-seeing place,” the idea was conceived not for the prison, but for the factory.

Jeremy Bentham is usually credited with the idea of the panopticon, but it was his younger brother, Samuel Bentham, who saw the promise of panoptical observation in the 1780s while in the service of Grigory Potemkin, a Russian officer and statesman. Potemkin, mostly remembered for creating fake villages to fool his lover, Catherine the Great, was in a quandary: his factories, which churned out everything from brandy to sailcloth, were a hot managerial mess. He turned to Samuel, a naval engineer whose inventions for Potemkin also included the Imperial Vermicular, a wormlike, segmented 250-foot barge that could navigate sinuous rivers. Samuel summoned skilled craftsmen from Britain and set them to the hopeless task of overseeing a refractory mass of unskilled peasant laborers who cursed and fought in a babel of languages. Determined to win Potemkin’s favor, he hit on a plan for a workshop at a factory in Krichev that would allow a person, or persons, to view the entire operation from a central inspector’s lodge “in the twinkling of an eye,” as his brother Jeremy would later write in a letter. The inspector could at once evoke the omnipresence of God and the traditional Russian noble surrounded by his peasants. Laborers who felt themselves to be under the constant eye of the inspector would give up their drunken brawls and wife-swapping in favor of work.

War thwarted Samuel’s plans for the Krichev factory, eventually forcing him to return home to Britain, where, in 1797, he drew up a second panoptical scheme, a workhouse for paupers. Six years earlier, in 1791, Jeremy had borrowed Samuel’s idea to publish a work on the panoptical prison, built so that guards could see all of the inmates while the latter could only presume they were being watched, fostering “the sentiment of a sort of omnipresence” and “a new mode of obtaining power of mind over mind.” In America, the Society for Alleviating the Miseries of Public Prisons adopted panoptical elements for the Eastern State Penitentiary in Philadelphia, adding solitary confinement with the idea of delivering the maximum opportunity for prisoner repentance and rehabilitation. Visiting the prison in 1842, Charles Dickens noted that its chief effect on inmates was to drive them insane.

Before the days of industrialization, employers had little use for surveillance schemes. The master craftsman lived in his workshop, and his five to ten apprentices, journeymen, and hirelings occupied the same building or adjacent cottages, taking their behavioral cues from his patriarchal authority. The blacksmith or master builder or shoemaker interacted with his underlings in a sociable and informal atmosphere, taking meals with them, playing cards, even tippling rum and cider. Large-scale manufacturing swept this all away. Workmen left the homes of their employers; by the early decades of the nineteenth century, the family-centered workplace—where employers provided models of behavior, food, and lodging—was becoming a thing of memory.

IMAGE: Sacks full of Stasi files in the former Ministry for State Security headquarters, Berlin, 1996. © SZ Photo / Joker / David Ausserhofer / Bridgeman Images.

Proto-industrialists found that their new employees, an ever-shifting mass of migrants and dislocated farm boys, found ample opportunities for on-the-job drunkenness, inattention, and fractious behavior. In his classic work A Shopkeeper’s Millennium, historian Paul E. Johnson observes that in America an answer to this problem was found in the Protestant temperance movement just then blowing righteous winds across the Northeast. Managers found that the revival and the Sunday school could foster strict internal values that made constant supervision less important. Workers, if properly evangelized, would turn willingly from the bottle to the grueling business of tending power-driven machines. God would do the monitoring as He does it best—from the inside.

Unfortunately, God’s providential eye tended to blink in the absence of regular churchgoing. So in the 1880s and 1890s, mechanical engineer Frederick Winslow Taylor displaced God with scientific management systems, devising precise methods of judging and measuring workers to ensure uniformity of behavior and enhanced efficiency. Taylor’s zeal to scrutinize every aspect of work in the factory led to such inventions as a keystroke monitor that could measure the speed of a typist’s fingers. His methods of identifying underperforming cogs in the industrial machine became so popular that Joseph Wharton, owner of Bethlehem Steel, incorporated Taylor’s theories into the bachelor’s degree program in business he had founded at the University of Pennsylvania. Harvard University soon created a new master’s degree in business administration, the MBA, that focused on studying Taylorism.

Workplace surveillance didn’t evolve much beyond Taylor’s ideas until closed-circuit television brought prying to heights unimagined by the brothers Bentham. In 1990 the Olivetti Research Laboratory, in partnership with the University of Cambridge Computer Laboratory, announced an exciting new workplace-spying project aptly named Pandora. The Pandora’s Box processor handled video streams and controlled real-time data paths that allowed supervisors to peek in on remote workstations. An improved system launched in 1995 was named Medusa, after the Greek monster who turned victims to stone with her gaze.

By the early twenty-first century, electronic monitoring in the workplace had become the de facto norm, with bosses peering into emails, computer files, and browser histories. From the lowest-rung laborers to the top of the ivory tower, no employee was safe. In 2013 Harvard University was found to have snooped in the email accounts of sixteen faculty deans for the source of a media leak during a cheating scandal. Global positioning systems using satellite technology, which came to maturity by 1994 and grew popular for tracking delivery trucks, opened new methods of watching. In 2013, Dennis Gray, owner of Accurid Pest Solutions, satisfied a hunch that workers were straying from their tasks: he quietly installed GPS tracking software on the company-issued smartphones of five of his drivers; one indeed was found to be meeting up with a woman during work hours. In 2015 Myrna Arias, a sales executive for money-transfer service Intermex, objected to her employer monitoring her personal time and turned off the GPS device that tracked her around the clock. She was fired.

Secrecy lies at the very core of power.
– Elias Canetti, 1960

Surveillance technology stirs up profound questions as to who may observe whom, under what conditions, for how long, and for what purpose. The argument for monitoring the vital signs of an airline pilot, whose job routinely holds lives at stake, may seem compelling, but less so for a part-time grocery store clerk. In a 1986 executive order President Ronald Reagan, expressing concern about the “serious adverse effects” of drug use on the workforce, which resulted in “billions of dollars of lost productivity each year,” instituted mandatory drug testing for all safety-sensitive executive-level and civil-service federal employees. A noble mission, perhaps, but prone to expand like kudzu: by 2006 it entangled up to three out of four jobseekers, from would-be Walmart greeters to washroom attendants, who were forced to submit to such degradations as peeing in a plastic jar, sometimes under the watchful eye of a lab employee. Thirty-nine percent could expect random tests after they were hired, as well as dismissal for using substances on or off the job, regardless of whether their use impaired performance. Job applicants accordingly changed their behavior; one scheme involved ordering dog urine through the mail to fool the bladder inspectors.

At the 2014 Conference on Human Factors in Computing Systems, held in Toronto, participants noticed an unusual sign affixed to restroom doors: behavior at these toilets is being recorded for analysis. It had been placed there by Quantified Toilets (slogan: every day. every time.), whose mission, posted on its website, states: “We analyze the biological waste process of buildings to make better spaces and happier people.” At the conference, Quantified Toilets was able to provide a real-time feed of analytical results. These piss prophets of the new millennium could tell if participants were pregnant, whether or not they had a sexually transmitted disease, or when they had drugs or alcohol in their system. (One man had shown up with a blood-alcohol level of 0.0072 percent and a case of gonorrhea.)

Quantified Toilets, it turned out, was not a real company, but a thought experiment for the conference, designed to provoke discussion about issues of privacy in a world where every facial expression, utterance, heartbeat, and trip to the bathroom can be captured to generate a biometric profile. Workplace surveillance, after all, is a regulatory Wild West; employees have few rights to privacy on the job. A court order may be necessary for police to track a criminal suspect, but no such niceties prevent an employer from exploring the boundaries of new technologies. History suggests that abuses will be irresistible: in 1988 the Washington, DC, police department admitted using urine tests to screen female employees for pregnancy without their knowledge.

Biosurveillance has strong allies, including its own Washington lobbying firm, the Secure Identity and Biometrics Association, committed to bringing new products to government, commercial, and consumer spheres. The VeriChip, a human-implantable microchip using radio frequency identification, allows scanners in range of the implant to access records and information about a person. It received FDA approval in 2004 (though the company later merged and became PositiveID). In Mexico, eighteen workers at the attorney general’s office were required to have the rice-grain-sized chip injected under their skin to gain access to high-security areas. One anti-RFID crusader has called the technology the “mark of the beast,” as predicted in the Book of Revelation.

In the film Gattaca, set in the not-too-distant future, biometric surveillance is deployed to distinguish between genetically engineered superior humans and genetically natural inferior humans, who are forced to do menial jobs. We are quickly approaching such a world: employers who are able to identify—and create—workers with superior biological profiles are already turning the science fiction into reality.

In its corporate materials, Humanyze assures clients that privacy is a top priority. The names of employees are stored separately from behavioral information, and individual conversations aren’t recorded, just the metadata—a distinction familiar to those following the story of the NSA’s widespread phone-tapping program. Still, it requires little imagination to see how employers could use such data for more extensive and rigorous surveillance of individual workers. A benign boss in the present may use the data to decide the arrangements of break rooms and cubicles to enhance worker satisfaction and, in so doing, improve productivity. But in the future the same data may be retrieved and analyzed for unimagined possibilities. Observation is versatile in its application. In the face of capitalist demands for high performance and efficiency, abstract ideas like privacy and freedom can come to sound quaint and sentimental.

As optic and electronic watching give way to biosurveillance, the architecture of the Bentham brothers’ panopticon melts away and becomes internalized. The self-watching employee, under her own unwavering gaze, pre-adjusts behavior according to a boss’ desire. Biosurveillance is sold as a tool for boosting happiness, but it also promotes a particular idea of what happiness is—which probably looks a lot more like workers who don’t make trouble than like squeaky wheels or even like the champions of disruption touted in Silicon Valley. The power to make you happy is also the power to define your happiness.

With his mantra “the medium is the message,” Marshall McLuhan stressed that the changes wrought upon us by technology may be more significant than the information revealed by it. Devices that monitor our minds and movements become part of who we are. Back in the Cold War, the Western press routinely derided Communist-bloc news clips of happy workers toiling away, singing songs in the mills and fields. One anti-communist propaganda animation from 1949, Meet King Joe, depicts a Chinese peasant smiling only because he is unaware of the paltriness and restrictions of his conditions. Such promos, perhaps, were just ahead of their time. Modern capitalism is poised to do them one better.

Secrets are rarely betrayed or discovered according to any program our fear has sketched out.
– George Eliot, 1860

Despite its name, a company like Humanyze—which brings forth the next frontier of biometric, device-driven surveillance—can make us less ourselves, more like who we’re supposed to be according to the objectives of those who track our metrics. When we can feel, even on a cellular level, the gaze of the inspector, the invisible hand becomes the invisible eye, guiding as it does best, from within. Perhaps we will find true what we once feared: that contented workers are all alike. But so long as we are happy, who cares?

A pittance for Zika, $600 billion for the Pentagon


20 May 2016

As the Zika virus threatens a worldwide epidemic, and large areas of the United States are poised to be hit, the US Congress has yet to pass a bill authorizing the large sums needed to fight the virus and the diseases caused by it.

As the virus continues to spread, however, the US House voted on Wednesday to approve a $602 billion defense policy bill to fund the US military for the fiscal year beginning in October. The bill must be reconciled with a version the Senate is expected to consider by the end of May.

Several months ago, the Obama administration requested $1.9 billion to combat Zika, a figure far below what is needed. The House on Wednesday passed a bill to provide $622 million (about one one-thousandth of the military budget) to control Zika, requiring that the funds be fully offset by cuts to other spending, particularly to the Affordable Care Act.

The Senate voted on Thursday to pass its $1.1 billion version and proposed to add the cost to the deficit. President Obama has pledged to veto the House bill and has yet to comment on the Senate version.

All of these funding proposals are woefully inadequate to fight the threat of Zika in the US. They express the opposition of the entire political establishment to any serious steps against a virus that overwhelmingly affects the poor and vulnerable. The priority of the ruling class and its political representatives is not the protection and wellbeing of the vast majority of Americans, but funding the gigantic US military apparatus that is deployed throughout the world to prop up dictatorships and to maim and kill civilians.

The US Centers for Disease Control and Prevention (CDC) confirmed in March that there was sufficient evidence to establish that the Zika virus causes microcephaly, a devastating defect in which infants are born with smaller than normal heads as their brains fail to properly develop. Zika is also thought to cause Guillain–Barré syndrome and other autoimmune conditions that are potentially fatal.

Contraction of Zika is more common in areas that lack sanitation and garbage collection and have pools of standing water where the Aedes aegypti and Aedes albopictus mosquito species that carry the virus can breed. People living in homes without window screens and bed netting are also at risk. The virus can also be sexually transmitted.

The Pan American Health Organization reported the first confirmed Zika virus infections in Brazil in 2015. About one million cases of Zika infection have now been reported in Brazil, which is in the midst of a devastating economic crisis. The number of babies suspected or confirmed to have Zika-induced microcephaly is around 5,000. The epicenter of the Zika crisis is the country’s Northeast, in a country where 35 million people have no running water and over 100 million lack access to sewage systems.

The CDC has reported mosquito-borne transmission of the Zika virus in Puerto Rico, the US Virgin Islands and American Samoa. Puerto Rico is reporting about 100 confirmed cases per week, and 945 infections since the island’s outbreak began last year, 65 of them pregnant women. Last Friday, the US territory’s health department reported the first fetus to develop microcephaly, which was not carried to term.

Puerto Rico defaulted on $347 million of its debt payments on May 2. Last year, the government cut $250 million in appropriations for public health, resulting in the closure of hospitals and health care centers and job losses for thousands of public employees. The default will further curb efforts to fight the spread of Zika.

The virus will undoubtedly move north, beginning with the US South. The National Center for Atmospheric Research (NCAR) looked at 50 US cities where the Aedes and related mosquito species are known to exist. NCAR assessed cities for Zika risk due to temperature, proximity to airports and overall socioeconomic conditions.

NCAR created a map showing potential areas for significant breeding of the Aedes mosquitoes throughout the country. Five Florida cities were identified as high-risk, as were cities in Georgia, South Carolina and Alabama. Many of these areas have limited access to air conditioning, effective window screens and clean water.

Moderate risk for Zika has been identified in cities as far north as New York City, and as far west as Oklahoma City.

CDC Director Dr. Tom Frieden told ABC News that there is a “narrow window of opportunity” to tackle the growing Zika threat. “This is an unprecedented problem,” he warned. “We’ve never had a situation before where a single mosquito bite could lead to a devastating fetal malformation.”

Politicians in Washington, however, are unmoved by the potential social catastrophe. The Obama administration’s efforts related to Zika include incentives for the drug companies, offering them expedited approval of new drugs in return for ramping up their research to develop a vaccine to protect against the virus. The pharmaceutical companies have previously balked at doing such research, as it is not likely to bring in big profits.

The Zika virus and its horrifying effects, particularly on infants, are born of poverty and social inequality. They can be fought only on the basis of an internationally coordinated campaign that provides the resources not only to rapidly develop and distribute vaccines, but to eradicate the conditions of poverty and oppression that allow the virus to spread.

There are more than enough resources to be used to combat Zika and other modern-day plagues, but their utilization is blocked by the capitalist system, which subordinates all such concerns to the profits of a tiny financial oligarchy and its agenda of war abroad and social counterrevolution at home.

Kate Randall



Insurers set to sharply increase Obamacare premiums


By Kate Randall
26 April 2016

US health insurance companies are preparing to seek substantial increases in Obamacare premiums, the Hill reported Monday. Citing big losses on the Affordable Care Act marketplaces, many insurers will ask state insurance commissioners to approve double-digit hikes in ACA premiums and some may pull out of the market if they are not approved.

The planned premium hikes are a further exposure of the pro-corporate character of Barack Obama’s signature domestic legislation. Both candidates vying for the Democratic presidential nomination, Hillary Clinton and Bernie Sanders, have embraced the ACA as a supposedly progressive health care reform.

In reality, the ACA is designed to funnel increased profits to the private insurance companies. Under the law’s “individual mandate,” those without insurance through a government program or employer must obtain coverage offered by private insurers or pay a penalty. The insurers are demanding a hefty profit as the condition for their participation in the ACA marketplaces.

Many insurers say they have been losing money on Obamacare plans, in part due to setting their premiums too low when they started in 2014. “There are absolutely some carriers that are going to have to come in with some pretty significant price hikes to make up for the underpricing that they did before,” Sabrina Corlette of Georgetown University’s Center on Health Insurance Reforms told the Hill.

Qualified Health Plans (QHPs) under Obamacare must offer coverage regardless of a person’s preexisting health conditions, are barred from charging higher premiums based on health status, and are limited in how much more they can charge based on age. As a result, QHPs are more attractive to older, less healthy people and less attractive to younger, healthier people. With fewer young and healthy people enrolling in the ACA plans, the pool of clients is more costly to insure.

A study released Friday by the Mercatus Center at George Mason University found that insurance company losses from QHPs on the individual market were in excess of $2.2 billion. These losses came despite insurers receiving net reinsurance payments of $6.7 billion from the ACA’s reinsurance program, a program set up under the law to compensate insurers for large claims incurred by “high-risk individuals in the individual market.” That program is set to expire at the end of 2016.

A report from McKinsey & Company found that in the individual insurance market, which includes the ACA marketplaces, insurers were profitable in only nine states and lost money in 41. Larry Levitt, an expert on the ACA at the Kaiser Family Foundation, told the Hill, “Either insurers will drop out or insurers will raise premiums.”

Insurance analysts have warned that if more young, healthy people do not sign up for Obamacare coverage, the individual marketplace may collapse, with insurers pulling out in droves and sending the market into what is known in the insurance industry as a “death spiral.”

UnitedHealthcare, the largest single health carrier in the US, said in November that it was considering leaving Obamacare by 2017 due to financial losses. UnitedHealthcare’s definition of “losses” was the possibility of not seeing the same $1.6 billion in profits that it pocketed in the third quarter of 2015. Last week, the company announced it was dropping its ACA plans in Arkansas and Georgia and that more states could follow.

Blue Cross Blue Shield dropped its ACA plans in New Mexico last year after it lost money and state regulators rejected a proposed 51.6 percent premium increase. Blue Cross Blue Shield of North Carolina now says it may drop out of the ACA marketplace in that state due to losses.

News of the planned premium hikes follows a poll earlier this year showing widespread public dissatisfaction with the Affordable Care Act. Polling by National Public Radio and the Robert Wood Johnson Foundation found that more than a quarter of US adults say they have been personally harmed by the health care law since its passage.

Twenty-six percent of the poll’s 1,002 respondents said that the cost of health care has been a serious strain on their finances over the last two years. About 40 percent of those facing these financial struggles said they have spent all or most of their savings accounts on large bills, while 20 percent said they hadn’t filled prescriptions because they could not afford them.

Insurers seeking the premium hikes claim that the blow will be softened by the tax credit available to some low-income individuals and families who qualify under Obamacare. However, about 15 percent of Obamacare enrollees do not receive these subsidies, so they would bear the full burden of any premium increases.

Even with the subsidies, the overwhelming majority of the least expensive “bronze” Obamacare plans come with deductibles in excess of $5,000. This means that payment for all but certain “essential” services must be made out of pocket before any coverage kicks in. These costs are forcing many people to self-ration and go without needed medical care for themselves and family members.

Many of the bronze plans also offer extremely narrow networks, restricting access to doctors, hospitals and other providers. In an effort to boost their profits, insurers are expected to further restrict their provider networks in addition to raising premiums. The Mercatus Center study cited above found that QHPs in 2014 with narrow provider networks performed better financially than those with broader networks.



Death to America: Suicide Surge Parallels Era of Economic Woes

Across nearly all demographics, thirty-year trend shows increasing numbers of people living in the United States would rather not live at all

‘Many people view suicide as a mental health problem, but many people who die of suicide do not have a mental health problem,’ explains Kristin Holland of the CDC. Lack of accessible healthcare and economic anxiety believed to play major role in troubling trend. (Photo: Shutterstock)

A new federal report reveals a surging suicide rate among the U.S. population over the last three decades, a trend that coincides with growing income inequality, widespread economic stagnation, and a continued lack of basic health services over the same period.

Published Friday by the National Center for Health Statistics, an arm of the Centers for Disease Control and Prevention (CDC), the new report shows how—after a plateau in the 1980s and ’90s—the suicide rate in the U.S. increased dramatically from 1999 to 2014, with the largest increase taking place after 2006. According to the CDC, suicide remains the 10th leading cause of death in the country.

“It’s a broad-based increase in suicide,” Sally Curtin, a statistician with the CDC and one of the report’s main authors, told the PBS Newshour.

Among the key statistics contained in the report:

  • From 1999 through 2014, the age-adjusted suicide rate in the United States increased 24%, from 10.5 to 13.0 per 100,000 population, with the pace of increase greater after 2006.
  • Suicide rates increased from 1999 through 2014 for both males and females and for all ages 10–74.
  • The percent increase in suicide rates for females was greatest for those aged 10–14, and for males, those aged 45–64.
  • The most frequent suicide method in 2014 for males involved the use of firearms (55.4%), while poisoning was the most frequent method for females (34.1%).
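
The first bullet’s percentage can be verified directly from the two rates it cites (a quick arithmetic check, not part of the NCHS report):

```python
# Sanity-check the reported 24% rise in the age-adjusted suicide rate, 1999-2014.
rate_1999 = 10.5   # deaths per 100,000, age-adjusted (from the bullet above)
rate_2014 = 13.0
pct_increase = (rate_2014 - rate_1999) / rate_1999 * 100
print(round(pct_increase))  # 24, matching the reported figure
```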

According to the Guardian, “This new suicide data underpins recent studies that showed a decline in life expectancy among middle-aged, white Americans – especially women. Such studies attributed the increasing death rate to drug and alcohol misuse, as well as suicide. However, the NCHS data did not analyze racial and ethnic differences in suicide.”

Though countless other studies have looked specifically at what drives individuals or various groups to suicide, Friday’s NCHS report focuses on tracking the overall statistics and demographics while leaving the cause behind the trends largely unexamined.

As the Guardian notes, however, “a 2013 analysis from the [CDC] noted the recent economic downturn and a vulnerability among baby boomers who had ‘unusually high suicide rates during their adolescent years’ as possible contributing factors to the rising suicide rate for middle-aged adults.”

And NPR reports how the national trend of “economic stagnation” during this same 30-year time period was leaving “more people out of jobs, and probably made it harder for people to access health care and treatment.”

In concert with growing economic inequality, job loss, and the mounting burden of personal debt, NPR’s report explains why lack of health insurance is a key factor: too often, people who take their own lives “weren’t covered or didn’t have access to treatment for depression, the most common risk factor for suicide.”

Kristin Holland, a behavioral scientist in the CDC’s Division of Violence Prevention who was not involved in the current research, told CNN that the “economic downturn” that took hold in the 2000s should certainly be considered a factor in the spiking rate.

“Many people view suicide as a mental health problem, but many people who die of suicide do not have a mental health problem. It’s a public health problem,” Holland said.

Bolstering the idea that financial factors play a key role in the surging suicide rate, an academic review of existing peer-reviewed research published on Friday reveals a consistent correlation between individual stress and broader economic crisis. The review, conducted by researchers in Italy, notes in its abstract:

In 2008 a deep economic crisis started in the US and rapidly spread around the world. The crisis severely affected the labor market and employees’ well-being. Hence, the aim of this work is to implement a systematic review of the principal studies that analyze the impact of the economic crisis on the health of workers. We conducted our search on the PubMed database, and a total of 19 articles were selected for review. All studies showed that the economic crisis was an important stressor that had a negative impact on workers’ mental health. Most of the studies documented that a rise in unemployment, increased workload, staff reduction, and wages reduction were linked to an increased rate of mood disorders, anxiety, depression, dysthymia, and suicide.

In Europe, where economic crises driven by the 2007 global economic collapse and the severe austerity measures that followed have caused widespread hardship, numerous studies in recent years (here, here, and here) have shown how suicide rate increases go hand in hand with the implementation of neoliberal reforms, including the cutting of wages and pensions as well as reductions of public services.

Last year, an article published in the American Journal of Preventive Medicine also revealed that suicide rates among middle-aged Americans (aged 40 to 64) had climbed since 1999, including the same drastic increase since 2007 at the outset of the financial crisis.

According to the authors of that study, “The sharpest increase in external circumstances appears to be temporally related to the worst years of the Great Recession, consistent with other work showing a link between deteriorating economic conditions and suicide.”


Life and Death in the Purple Box: Prince, What Happened?


Cover of Prince’s “Lovesexy”, Warner Bros. / Paisley Park, 1988.

Yesterday I understood for the first time how old-timers must have felt on the day that Elvis tumbled face-first off his porcelain throne almost 39 years ago. Told of Presley’s death, John Lennon—who had once said, “Before Elvis there was nothing”—reportedly shrugged it off with the kind of pitiless succinctness that only a bitterly disappointed lover could reach: “Elvis died when he went into the army.”

Is it too harsh—or too soon—to acknowledge that something like that happened to Prince too? Anyway, that was my first reaction yesterday when my wife phoned and blurted out, in a voice mixed with incredulity, grief and anger, “Prince is dead.” The news was so shocking and unexpected that I had to ask her to repeat it. It still hasn’t really sunk in.

It was only later that I realized part of my problem in absorbing the fact of his death was that it’s seemed to me for many years that Prince died sometime around 1990. After that point he spent most of the next 20-plus years releasing indifferent-sounding music and filling arenas on the strength of his well-earned legend. Like Elvis, he continued to scatter the occasional gem in an interminable run of joyless, by-the-numbers music; there were still singles like “Get Off,” “Sexy Motherfucker,” and “Let It Go,” and you should check out the little-noted One Nite Alone… Live! two-disc set, if you can find it. (Worse, like Elvis–at least if TMZ’s sources are to be believed–he appears to have died from ingesting high-test prescription pharmaceuticals.) For reasons no one has been able to explain, the Prince I had followed and written about during the preceding decade–well, he just left the building.

The Prince we are all mourning now worked from 1978-1988. Barely past his teens at the outset, he spent that decade trashing every boundary of music and identity he encountered with a sense of joy, discovery, and complete self-assurance. During those years he released 10 albums and salted away enough material for God knows how many others. The Black Album, released belatedly in 1994, was recorded during those years, and I have tapes of a couple dozen additional vault tracks from 1985-87 that deserved release then and still do today. That body of work earned him a spot in a 20th Century American pantheon that includes not only the usually cited suspects (James Brown, Sly Stone)  but such virtuosic composers, players and bandleaders as Duke Ellington and Louis Armstrong. (And yes, that’s an all-black list–not in observance of the kind of racial boundaries Prince despised, but because I can’t really think of any white people who deserve to stand with them as musical pioneers.) He was that exciting. That germinal.

For me his last great work—perhaps his greatest—was the Lovesexy album and tour in 1988-89. A mammoth production on record and on stage, it was an electrifying, gloriously cluttered summation of everywhere he had traveled musically and emotionally in a decade’s worth of frenzied creativity.

It certainly didn’t feel like the end of anything, but in retrospect it was–the only conceivable title for any box set retrospective of his post-1990 records would have been PerFunkTory. For years I talked with friends and colleagues–Prince’s as well as my own–about what exactly happened. Our theories ran the gamut. One music-writer friend chalked it up to Prince’s mounting disappointment with the way his music was received by the critics he sniped about but read religiously. By this line of thinking, the confused, ambivalent reaction to Lovesexy was the last straw. Others said, well, his time was up: Who, among rock & roll era giants, has stayed at the top of his or her game as a performer and as a composer for more than 10 years? Nobody.

Both those observations make sense, but I always suspected that gravity simply caught up with Prince. Here was a freaky-talented young man who hatched a very personal vision of a musical community with no fences around matters of race, sexuality, musical style–that is, a place without any of the limitations he routinely encountered as a short, skinny, preternaturally ambitious black kid growing up in what was then the whitest major city in the country.

But another way to put that is to say that his vision was hatched in a state of profound isolation, and that it drew its power in large part from the desperation born of that isolation. How many of the Prince songs about sex, God, and the polymorphously perverse were really about loneliness at their emotional heart? Lots of them. Squint and you might say all of them. Go back and listen to “I Wanna Be Your Lover,” “If I Was Your Girlfriend,” “Anna Stesia,” even “When Doves Cry.” You’ll hear lines like,

I wanna be your brother
I wanna be your mother and your sister, too


If I was your girlfriend
Would U remember 2 tell me all the things U forgot
When I was your man?


Have you ever been so lonely
That you felt like you were the
Only one in this world?

Have you ever wanted to play
With someone so much you’d take
Any one boy or girl?

I would submit that none of this is about gender-bending, or sex, or even pleasure, for its own sake; it’s about trying to escape the sort of desperate, terminal solitude from which he came and to which, in the end, he seemed to return. Last night they closed the street in front of First Avenue, the Minneapolis club made famous by the Purple Rain movie, and a huge crowd danced there through the night. Seeing the pictures at a local news site this morning reminded me of a story that one Prince insider told me years ago about one of his private birthday bashes. Everyone there was given a Prince mask and asked to put it on. Soon the entire ballroom was choked with Princes, but Prince himself was nowhere in sight. He was lurking by himself in an alcove above the crowd, my source told me, watching his guests dance without him. And there he stayed until he left the party.

And this is about as close to his own utopian vision of community as Prince the man ever allowed himself to get. So yeah, his death was shocking. It was sad. But from here, in these first numbing hours, I keep thinking–please forgive me–that it was his life and not his death that constituted the more profound tragedy. He deserved better. So did Elvis. So do we all.

Beginning in 1984, Minneapolis writer Steve Perry covered Prince’s music and career in the Twin Cities weekly City Pages and elsewhere.  



Life expectancy declines for white Americans


21 April 2016

It seems that each week brings new information documenting the precipitous decline in the conditions of life for a large majority of the American people. Yet in all of the media commentary on the mood of anger expressed in the convulsive 2016 election campaign, little is said about the profound and worsening social crisis that is fueling it.

On Wednesday came a report from the Centers for Disease Control and Prevention (CDC) analyzing data on deaths recorded in the United States in 2014. The CDC found that life expectancy for whites has begun to drop, with a more pronounced decline for women than for men.

The actual year-on-year change from 2013 to 2014 may seem small, with life expectancy at birth for whites falling from 78.9 years to 78.8 years, but the direction of the change is in and of itself shocking. As Dr. Elizabeth Arias, author of the report, notes, “The trend in life expectancy at birth has been one of improvement since national estimates were first published with 1900 data.”

After more than a century of rising life expectancy, interrupted only briefly by World War II and the worst year of the AIDS epidemic, life expectancy for whites remained constant in 2012 and 2013, and then declined in 2014. Life expectancy for the entire US population remained unchanged because there was a slight improvement for African-Americans, Hispanics and other minorities.

Dr. Arias told the New York Times that the decline in life expectancy was largely the result of increased death rates for white men and women from their mid-20s to their mid-50s, the prime years of adulthood, when death rates are typically quite low. “The increase in death in this segment of the population was great enough to affect life expectancy at birth for the whole group,” she said. “That is very unusual.”

While categorized in racial terms by the study, what is expressed in such figures is the consequences of class warfare. Other studies have shown a shocking divergence in life expectancy between poorer and wealthier Americans. The impact of decades of deindustrialization, and the social ills produced by it, is reflected in one of the most basic indicators of social well-being.

Drug overdoses, liver disease (much of it a byproduct of alcoholism and drug abuse) and suicide are the main causes of these premature deaths. Addiction to prescription opioids like OxyContin is a major factor: Americans comprise 5 percent of the world’s population but consume 80 percent of prescription opioids.

According to an analysis of health data by the Washington Post published April 10, the death rate for rural white women in their 40s has risen by 30 percent since 2000, and by nearly 50 percent since 1990. In 30 counties in the rural South, middle-aged white women now have a higher mortality rate than black women of the same age.

Summing up the overall dimensions of the crisis in life expectancy, the Post wrote: “Compared with a scenario in which mortality rates for whites continued to fall steadily after 1998, roughly 650,000 people have died prematurely since 1999—around 450,000 men and nearly 200,000 women. That number nearly equals the death toll of the American Civil War.”

The contradictions of American capitalism find expression not solely in the dismal indices of declining social well-being, but also in an increasingly militant, angry and politically radical mood in the working class. Vulnerable individuals may fall victim to social evils like drug abuse and suicide, but the class as a whole will seek to find a way out of the crisis on the road of struggle.

Those struggles that have broken out over the past year have immediately come into conflict with the trade unions, which function as part of the police apparatus of corporate America for suppressing the working class. From the oil workers’ strike of early 2015 to the rank-and-file rejection by autoworkers of sellout contracts accepted by the UAW to the current protests and strikes by teachers and the Verizon walkout, workers are confronting the necessity of breaking through the straitjacket of the unions and their alliance with the Democratic Party, and adopting a new political perspective.

In the 2016 presidential campaign, the social anger has been reflected in support for nominally anti-establishment candidates in both big business parties—the real estate mogul Donald Trump on the right and, more broadly, the self-described “democratic socialist” Bernie Sanders on the left.

Sanders presents himself as an opponent of Wall Street greed and the “billionaire class,” but his main purpose is to divert the growth of anti-capitalist sentiment in the working class and among young people back into the suffocating confines of the Democratic Party. He has pledged to back Hillary Clinton if and when she wins the nomination.

While the outcome of the primary campaign is not settled, the New York primaries have increased the chances that Trump will secure the Republican nomination and Clinton will win the Democratic race. This “choice” is itself a demonstration of the complete dead end of the capitalist two-party system.

Clinton—the former first lady, senator from New York and secretary of state—is the personification of the corrupt status quo. Together with her husband, she has raked in $140 million since leaving the White House in 2001.

Clinton’s likely opponent is a billionaire demagogue who became a celebrity through real estate manipulations, casinos and a reality TV show, and who now engages in fascistic rants against immigrants, Muslims and women while encouraging violence against those who protest his racist rhetoric. Trump’s main rival, Texas Senator Ted Cruz, meanwhile, is a favorite of the Tea Party and the Christian right; he advocates the privatization of Social Security and the carpet bombing of Iraq and Syria.

None of these candidates can offer any policies to address the catastrophic conditions affecting ever broader sections of the American people. These conditions are rooted in the insoluble crisis of American and world capitalism, which both parties defend. Regardless of who wins the election in November, the ensuing months will see an intensification of austerity and attacks on democratic rights at home and militarism and war abroad.

The initial growth of social opposition and working class consciousness reflected in the elections underscores the urgent need for a genuine socialist and internationalist perspective to guide the coming struggles of the working class.

Patrick Martin



