Unemployment up in most US states

By Nick Barrickman
20 August 2014

The unemployment rate rose from June to July in 30 of the 50 states and the District of Columbia, the Bureau of Labor Statistics (BLS) reported on Monday.

The only states with statistically significant changes were those where the ranks of the unemployed, or of those applying for unemployment benefits, grew. These states were concentrated in the Southeast and mid-Atlantic, including Tennessee, South Carolina, Georgia, Maryland, Mississippi and West Virginia. Mississippi had the highest overall unemployment rate in the country, at 8 percent. Twelve states saw no appreciable change.

The BLS also reported that the number of jobless youth rose by 913,000 from April to July, significantly higher than the 692,000 added to unemployment rolls during the same time last year. Historically, the April-July “summer jobs” season is a peak hiring period for youth aged 16-24.

In absolute numbers, Ohio, Maryland, and South Carolina led the nation in net job losses, with 12,400; 9,000; and 4,600 positions shed, respectively. Eight states and the District of Columbia currently have unemployment rates higher than the national rate of 6.2 percent, while another 24 have rates not differing significantly from the national average.

The region with the highest unemployment rate was the West Coast, at 6.6 percent. Only 0.7 percentage points separated it from the region with the lowest official jobless rate, the Midwest, at 5.9 percent.

The concentration of job losses in the mid-Atlantic region is politically significant. Home to the nation’s capital, the mid-Atlantic is often described as “recession-proof,” a reference to its proximity to and reliance upon the federal government. “Maryland, Virginia and D.C. really suffered from the cutbacks in federal spending, and that seems like old news, but it’s a cumulative effect… So stuff that started years ago keeps showing up in the unemployment numbers, and we keep seeing less activity,” Stephen Fuller, director of George Mason University’s Center for Regional Analysis, told the Washington Post, referring to the budget cuts known as the “sequester,” implemented by the Obama administration last year.

The BLS and a number of media outlets have noted that despite the gloomy month-to-month numbers, the national unemployment rate is 1.1 percentage points lower than in the same period last year.

In fact, the decline in the official jobless rate has largely been driven by workers falling out of the workforce. At 62.9 percent, the labor force participation rate—the share of adults either working or seeking work—remains far below the 67.2 percent registered in 2000.

Jobs actually created in the “recovery” have been heavily concentrated in the low-paying service sector. The National Employment Law Project (NELP) reported this spring that since the financial crash of 2008, 1.9 million high- and average-paying jobs in the private sector have been eliminated and replaced with 1.8 million low-wage jobs. This finding was later backed up by a study released by the US Conference of Mayors showing that the average job lost in 2008–2009 paid $61,637 while the typical job created after this period paid only $47,171—a 23 percent drop in pay.

The gains made in a number of states may also prove to be illusory. The Post notes that “[t]he reason the [unemployment] rate rose nationwide even as hiring increased is that more Americans launched job searches but didn’t find work.”

Ferguson: Generations Of Economic Inequality Fuel Violent Protests

Ferguson Protest March
Ferguson, MO, Aug. 16, 2014: Protesters march through town in a peaceful demonstration to mark a week since police killed Michael Brown. While the national narrative has focused on social justice, many residents say economics are almost as important. (Photo: Kathleen Caulderwood)

Ferguson, MO — The racial tensions that sparked violent clashes between residents and police in the wake of the shooting death of Michael Brown in the small St. Louis suburb of Ferguson, Missouri, also are evident in the economic inequalities that separate whites from blacks. In one of the most segregated cities in the country, where a mostly white police force serves a majority-black population, black residents of Ferguson are far more likely to be unemployed than their white neighbors. That disparity and economic hopelessness have helped fuel the frustrations of residents rallying in the streets for a more just community.

“The barriers are invisible. You can’t see it, but they’re there,” said Darren Seals, 27, who lives a few blocks from the intersection where police and protesters faced off Friday night. He was unemployed for three years before recently landing a job at General Motors, and is the only man in his family with regular work, he said.

His situation isn’t unique. Census figures for 2012 in St. Louis County, which includes Ferguson, show that 47 percent of African-American men are unemployed, while the rate for white men is just 16 percent.

“What people don’t realize right away is that the greatest disparities are at the economic level,” said Judy Ferguson Shaw, 64, who has lived in Ferguson for 54 years.

“A lot of people assume it’s just a bunch of guys going crazy, but really it’s something that’s been brewing for generations,” said John Knowles, 25, who is white, works two jobs and has lived in Ferguson for 15 years. “There’s no work around here at all. It’s especially bad for the black population because they tend to live in lower-income areas to begin with.”

In the census tract where Brown lived, median income is less than $27,000 and just half of the adults are employed. Brown, a black 18-year-old, was killed by a white police officer last week, prompting days of protests and looting in Ferguson.

Shernale Humphrey, 21, who works at the McDonald’s on West Florissant, where protesters have been gathering nightly, said she thinks education is also a factor. “A lot of black families aren’t in good school areas, so that makes it hard,” she said.

Humphrey said she participated in the national fast food workers’ strikes last year aimed at raising the minimum wage because “I had to do something. We need to make this community better for the next generation.”

Until the 1940s, black people were barred from buying houses in many parts of St. Louis County, which kept cities like Ferguson almost exclusively white. In the 1980s, white families began leaving for more distant suburbs, clearing the way for African-American families.

Since 2000, the city’s median household income has fallen by 30 percent, and Ferguson is still struggling after the recession, which left behind few jobs and mounting frustration.

“I can’t help but think the anger was intensified by a feeling of powerlessness, and it’s hard to ignore the ways in which the economy of the past few years has been hard on north St. Louis County,” St. Louis Post-Dispatch reporter David Nicklaus wrote Sunday. “If police tactics were the spark that set off the explosion in Ferguson this week, then poverty and hopelessness were the tinder.”

http://www.ibtimes.com/ferguson-generations-economic-inequality-fuel-violent-protests-1660724

Face Time: Eternal Youth Has Become a Growth Industry in Silicon Valley

Tuesday, Aug 12 2014

The students of Timothy Draper’s University of Heroes shuffle into a conference room, khaki shorts swishing against their knees, flip-flops clacking against the carpeted floor. One by one they take their seats and crack open their laptops, training their eyes on Facebook home pages or psychedelic screen savers. An air conditioner whirs somewhere in the rafters. A man in chinos stands before them.

The man is Steve Westly, former state controller, prominent venture capitalist, 57-year-old baron of Silicon Valley. He smiles at the group with all the sheepishness of a student preparing for show-and-tell. He promises to be brief.

“People your age are changing the world,” Westly tells the students, providing his own list of great historical innovators: Napoleon, Jesus, Zuckerberg, Larry, Sergey. “It’s almost never people my age,” he adds.

Students at Draper University — a private, residential tech boot camp launched by venture capitalist Timothy Draper, in what was formerly San Mateo’s Benjamin Franklin Hotel — have already embraced Westly’s words as a credo. They inhabit a world where success and greatness seem to hover within arm’s reach. A small handful of those who complete the six-week, $9,500 residential program might get a chance to join Draper’s business incubator; an even smaller handful might eventually get desks at an accelerator run by Draper’s son, Adam. It’s a different kind of meritocracy than Westly braved, pursuing an MBA at Stanford in the early ’80s. At Draper University, heroism is merchandised, rather than earned. A 20-year-old with bright eyes and deep pockets (or a parent who can front the tuition) has no reason to think he won’t be the next big thing.

This is the dogma that glues Silicon Valley together. Young employees are plucked out of high school, college-aged interns trade their frat houses and dorm rooms for luxurious corporate housing. Twenty-seven-year-old CEOs inspire their workers with snappy jingles about moving fast and breaking things. Entrepreneurs pitch their business plans in slangy, tech-oriented patois.

Gone are the days of the “company man” who spends 30 years ascending the ranks in a single corporation. Having an Ivy League pedigree and a Brooks Brothers suit is no longer as important.

“Let’s face it: The days of the ‘gold watch’ are over,” 25-year-old writer David Burstein says. “The average millennial is expected to have several jobs by the time he turns 38.”

Yet if constant change is the new normal, then older workers have a much harder time keeping up. The Steve Westlys of the world are fading into management positions. Older engineers are staying on the back-end, working on system administration or architecture, rather than serving as the driving force of a company.

“If you lost your job, it might be hard to find something similar,” a former Google contractor says, noting that an older engineer might have to settle for something with a lower salary, or even switch fields. The contractor says he knows a man who graduated from Western New England University in the 1970s with a degree in the somewhat archaic field of time-motion engineering. That engineer wound up working at Walmart.

Those who do worm their way into the Valley workforce often have a rough adjustment. The former contractor, who is in his 40s, says he was often the oldest person commuting from San Francisco to Mountain View on a Google bus. And he adhered to a different schedule: Wake up at 4:50 a.m., get out the door by 6:20, catch the first coach home at 4:30 p.m. to be home for a family supper. He was one of the few people who didn’t take advantage of the free campus gyms or gourmet cafeteria dinners or on-site showers. He couldn’t hew to a live-at-work lifestyle.

And compared to other middle-aged workers, he had it easy.

In a lawsuit filed in San Francisco Superior Court in July, former Twitter employee Peter H. Taylor claims he was canned because of his age, despite performing his duties in “an exemplary manner.” Taylor, who was 57 at the time of his termination in September of last year, says his supervisor made at least one derogatory remark about his age, and that the company refused to accommodate his disabilities following a bout with kidney stones. He says he was ultimately replaced by several employees in their 20s and 30s. A Twitter spokesman says the lawsuit is without merit and that the company will “vigorously” defend itself.

The case is not without precedent. Computer scientist Brian Reid lobbed a similar complaint against Google in 2004, claiming co-workers called him an “old man” and an “old fuddy-duddy,” and routinely told him he was not a “cultural fit” for the company. Reid was 54 at the time he filed the complaint; he settled for an undisclosed amount of money.

What is surprising, perhaps, is that a 57-year-old man was employed at Twitter at all. “Look, Twitter has no 50-year-old employees,” the former Google contractor says, smirking. “By the time these [Silicon Valley] engineers are in their 40s, they’re old — they have houses, boats, stock options, mistresses. They drive to work in Chevy Volts.”

There’s definitely a swath of Valley nouveau riche who reap millions in their 20s and 30s, and who are able to cash out and retire by age 40. But that’s a minority of the population. The reality, for most people, is that most startups fail, most corporations downsize, and most workforces churn. Switching jobs every two or three years might be the norm, but it’s a lot easier to do when you’re 25 than when you’re 39. At that point, you’re essentially a senior citizen, San Francisco Botox surgeon Seth Matarasso says.

“I have a friend who lived in Chicago and came back to Silicon Valley at age 38,” Matarasso recalls. “And he said, ‘I feel like a grandfather — in Chicago I just feel my age.’”

Retirement isn’t an option for the average middle-aged worker, and even the elites — people like Westly, who were once themselves wunderkinds — find themselves in an awkward position when they hit their 50s, pandering to audiences that may have no sense of what came before. The diehards still work well past their Valley expiration date, but then survival becomes a job unto itself. Sometimes it means taking lower-pay contract work, or answering to a much younger supervisor, or seeking workplace protection in court.

CONTINUED: http://www.sfweekly.com/sanfrancisco/silicon-valley-bottom-age-discrimination/Content?oid=3079530

Obama’s “recovery” and the social crisis in America

12 August 2014

“The good news is the economy clearly is getting stronger… Things are getting better.”
– President Barack Obama at an August 1 White House press conference

“I’m glad that GDP is growing, and I’m glad that corporate profits are high, and I’m glad that the stock market is booming.”
– President Barack Obama, speaking July 30 in Kansas City, Missouri

So speaks Barack Obama, for whom economic health is measured principally by the stock portfolios and bank accounts of the financial elite he represents. For a large majority of the population, however, conditions of life are, if anything, worse than they were at the height of the 2008-2009 financial crisis.

A recent report from the US Federal Reserve captures something of this social reality. According to the Fed’s survey, 70 percent of American households say they are no better off now than they were in 2008.

The survey also found that a typical American adult cannot raise $400 without borrowing or selling something. So fragile is the economic position of most households that an unexpected car problem, medical issue or legal entanglement can push them over the edge.

Additional findings by the Fed include:

  • More than half of those with student loans for their own education are not making regular payments: 18 percent are behind on their payments or being pursued by a collection agency, and 34 percent are in deferment or forbearance.
  • Nearly two thirds of the population do not have funds set aside to cover their expenses for three months.

For the super-rich, the “recovery” has been very real indeed. The Dow Jones Industrial Average and S&P 500 stock indexes have hit record highs, while the NASDAQ has more than tripled since its post-2008 lows.

Ostentatious displays of wealth are more and more the norm. Some among the super-rich have put their newfound wealth to use commissioning the construction of what the Wall Street Journal calls “vertical mansions” in New York City. One such building, 520 Park Ave, now under construction on the south side of Central Park, will feature 31 apartments, the smallest of which “will sprawl over full floors of about 4,600 square feet.” The Journal adds, “They will each have their own elevator landings and are set to list at $27 million or more.”

The largest apartment, “a 12,400-square-foot triplex with a private terrace… priced at as much as $10,000 per square foot,” will be the city’s most expensive apartment, priced at “considerably more than $100 million.”

Obama, a millionaire many times over, coddled since his youth by the forces of the state, may not find it hard to convince himself that the economic recovery he keeps talking about is real. His main social constituency, the ultra-rich, has more money than it knows what to do with.

The further enrichment of this social layer has been the overriding aim of his administration and the bipartisan agenda of both big business parties—the Democrats no less than the Republicans. It is the thread that runs through the multi-trillion dollar bank bailout, the wage cuts imposed as part of the government-orchestrated bankruptcy of GM and Chrysler, the money printing policies of the Fed, and the cuts to food stamps, unemployment benefits and other social programs.

Statistics presented in a report by the Conference of Mayors Friday show the results of this process. The report notes that while a typical job lost in the 2008-2009 recession paid $61,637 a year, a typical job added during the “recovery” paid $47,171—a 23 percent drop in wages.

This confirms an earlier study by the National Employment Law Project, which found that since the financial crash of 2008, 1.9 million high- and average-paying jobs in the private sector have been eliminated and replaced with 1.8 million low-wage jobs.

The mayors’ report adds that the top 20 percent of all households captured more than 60 percent of all income gains between 2005 and 2012, with the richest in this layer getting the bulk of the increase. Meanwhile, in many cities throughout the country, more than half the population earns less than $35,000 a year.

Adjusted for inflation, median household income dropped by 5.5 percent between 2005 and 2012. And compared to 1968, annual income based on the federal minimum wage has fallen by 32 percent in real terms, from $22,235 to $15,080.

This is the social reality confronting the working class—mass unemployment, reflected most clearly in the decline in the employment-to-population ratio, low wages, economic insecurity and permanent indebtedness.

For the last five years, vast resources have been transferred to the super-rich. The main mechanism for carrying this out—the inflation of the stock market and other financial assets—has only created the conditions for a new and even greater economic crisis, with even more devastating consequences.

Behind the facade of official politics, the United States is a social powder keg. The eruption of anger over the past several days in the wake of the police killing of an unarmed youth near St. Louis, Missouri is only a small indication of social tensions in America and an initial expression of the indignation felt by broad sections of the working class.

This anger must be armed with a political perspective—aimed at mobilizing the entire working class in opposition to the capitalist system, the root cause of social inequality, and replacing it with a socialist system based on the satisfaction of social needs, not private profit.

Andre Damon

Ayn Rand-loving CEO destroys his empire

The invisible hand waves bye-bye to Eddie Lampert, whose business plan has run Sears into the ground

This article originally appeared on AlterNet.

Once upon a time, hedge fund manager Eddie Lampert was living a Wall Street fairy tale. His fairy godmother was Ayn Rand, the dashing diva of free-market ideology whose quirky economic notions would transform him into a glamorous business hero.

For a while, it seemed to work like a charm. Pundits called him the “Steve Jobs of the investment world.” The new Warren Buffett. By 2006 he was flying high, the richest man in Connecticut, managing over $15 billion through his hedge fund, ESL Investments.

Stoked by his Wall Street success, Lampert plunged headlong into the retail world. Undaunted by his lack of industry experience and hailed as a genius, he boldly pushed to merge Kmart and Sears with a layoff and cost-cutting strategy that would, he promised, send profits into the stratosphere. Meanwhile the hotshot threw cash around like an oil sheikh, buying a $40 million pad on Florida’s Biscayne Bay, a record even for that star-studded county.

Fast-forward to 2013: The fairy tale has become a nightmare.

Lampert is now known as one of the worst CEOs in America — the man who flushed Sears down the toilet with his demented management style and harebrained approach to retail. Sears stock is tanking. His hedge fund is down 40 percent, and the business press has turned from praising Lampert’s genius to watching gleefully as his ship sinks. Investors are avoiding “Crazy Eddie” like the plague.

That’s what happens when Ayn Rand is the basis for your business plan.

Crazy Eddie has been one of America’s most vocal advocates of discredited free-market economics, so obsessed with Ayn Rand he could rattle off memorized passages of her novels. As Mina Kimes explained in a fascinating profile in Bloomberg Businessweek, Lampert took the myth that humans perform best when acting selfishly as gospel, pitting Sears company managers against each other in a kind of Lord of the Flies death match. This, he believed, would cause them to act rationally and boost performance.



If you think that sounds batshit crazy, congratulations. You understand more than most of America’s business school graduates.

Instead of enhancing Sears’ bottom line, the heads of various divisions began to undermine each other and fight tooth and claw for the profits of their individual fiefdoms at the expense of the overall brand. By this time Crazy Eddie was completely in thrall to his own bloated ego, and fancied he could bend underlings to his will by putting them through humiliating rituals, like annual conference calls in which unit managers were forced to bow and scrape for money and resources. But the chaos only grew.

Lampert took to hiding behind a pen name and spying on and goading employees through an internal social network. He became obsessed with technology, wasting resources on developing apps as Sears’ physical stores became dilapidated and filthy. Instead of investing in workers and developing useful products, he sold off valuable real estate, shuttered stores, and engineered stock buybacks in order to manipulate stock prices and line his own pockets.

Eddie’s crazy didn’t stop there. As a Wall Street creature fantastically out of touch with the kind of ordinary folks who shop at Sears, he inserted his love of luxury into the mix, trying to sell Rolex watches and $4,400 designer handbags through America’s iconic budget-friendly brand.

As his company was descending into Randian mayhem, Lampert continued to cheerfully inform stockholders that his revolutionary ideas would soon produce earth-shattering results. Reality: Sears has lost half its value in five years. Since 2010, Sears has closed more than half of its stores. Sears Holdings is financially distressed and Lampert’s own hedge fund has reduced its stake in the company. The Sears store in Oakland, California, open for business with boarded-up windows, has even been cited for urban blight.

Truth be told, hedge fund honchos have had little to fear from royally screwing companies. Bank accounts fattened at the expense of workers and other stakeholders, they go on their merry way to mess up something else. But the epic incompetence of guys like Lampert may be dispelling the myth that financiers are the smartest guys in the room. Research suggests that not only do hedge fund managers typically understand squat about running a company, they’re often not much good at beating the stock market, either. A recent Bloomberg article points out that in 2013, hedge funds returned 7.1 percent. That doesn’t sound so bad, until you consider that if you had just stuck your money in the Standard & Poor’s 500 Index you would have seen returns of 29.1 percent. Big difference!

While Lampert was caught up in Randian delusions of crass materialism and cutthroat capitalism, he failed to grasp that a business is as much a communal undertaking as an individual one. Employees are not just competitive beings—they benefit from cooperating with each other and perform better when they are respected rather than beaten down and driven by fear.

Slowly but surely, Ayn Rand’s economic theories are being discarded because they simply don’t add up in the real world. Even Rand acolyte Paul Ryan (R-Wis.) is now distancing himself, calling his well-documented enthusiasm for her an “urban legend.”

Lampert created a business model predicated on the notion that the invisible hand of the market would magically drive stellar results. With his belief in economic fairy tales, he managed to kill the goose that laid his own golden egg.

Looks like the invisible hand just waved goodbye to Eddie Lampert.

Lynn Parramore is an AlterNet contributing editor. She is co-founder of Recessionwire, founding editor of New Deal 2.0, and author of “Reading the Sphinx: Ancient Egypt in Nineteenth-Century Literary Culture.” Follow her on Twitter @LynnParramore.

http://www.salon.com/2013/12/10/ayn_rand_loving_ceo_destroys_his_empire_partner/

Obama and the Revival of Wall Street

by ROB URIE

Ahead of the upcoming mid-term elections, Democrat apologists are arguing that President Barack Obama’s resuscitation of Wall Street at public expense and his wholly inadequate ACA (Affordable Care Act) are policy successes worthy of political support. The base tactic is to set up false choices and then argue that Mr. Obama chose the better of the available alternatives. His unconditional resurrection of Wall Street is posed against an inevitable second Great Depression, when the real choice was between reviving the existing system of suicide finance and reining it in to serve a public purpose. And the problem in need of solving in the American health care system is a lack of access to health care for a substantial portion of the population; increasing the intrusion of the insurance industry into that system, as the ACA does, builds an even higher wall between people and care.

The base frame in support of the revival of Wall Street is that Mr. Obama had a choice of Great Depression 2.0 or the wholesale revival of Wall Street, however ‘distasteful’ the latter might have been. Left unsaid is that it is this very same Wall Street that destroyed the economies of the West by creating the housing boom and bust; that the only thing restored since the onset of the crisis is the incomes and wealth of the very richest; that the federal government held all of the cards in 2008–2009, when Wall Street was at risk of collapse; and that Mr. Obama put the very same Timothy Geithner and Larry Summers who promoted the bank interests that created the crisis in charge of covering it up. Nationalization of major Wall Street banks was put forward as a policy option in top-level discussions of resolving the crisis. The argument now being made that government agencies lacked the legal authority to do so ignores the practical circumstance that Wall Street was dependent on government largesse to avoid wholesale bankruptcy in 2008–2009, and that the ‘lacked legal authority’ argument was nowhere to be found in contemporaneous policy discussions.


Graph (1): Both income and wealth dispersion have worsened during Mr. Obama’s tenure as president. Illustrated are the rise in wealth of the very richest relative to the median and the decline in relative wealth of the poorest. While the trend toward income and wealth inequality preceded Mr. Obama’s time in office, his policies have overwhelmingly benefited the very rich and left everyone else to their own devices. Rather than simply giving bailout funds to Wall Street, the money could have been used to buy down negative equity for those duped into taking predatory loans to buy houses. Likewise, the Federal Reserve’s purchase of $4 trillion in assets has overwhelmingly benefited the very richest by raising financial asset prices. Source: Russell Sage Foundation.

To be clear, the Obama administration’s choice of what to do with Wall Street was not between consequence-free bailouts and another Great Depression. In fact, the IMF (International Monetary Fund) spent the previous half-century asserting this very point when it came to closing dysfunctional banks overseas. Predatory banks draw economic resources away from well-functioning parts of national economies under nearly all theories of banking and finance. Claims that Obama administration policies (Dodd-Frank) have ‘reformed’ the financial system ignore global financial interconnectedness, multiple commitments of limited collateral, and the fact that not even government regulators believe too-big-to-fail has been resolved. A relevant question is where the economic resources will come from to resolve the $1–$2 quadrillion in swaps notional value and other derivatives contracts that take precedence over all other claims, including depositors’, in bankruptcy. And the contention that government subsidies of Wall Street have now ended demonstrates near-complete ignorance of how financial markets function.


Graph (2): One of the more spurious contentions Democrat loyalists put forward is that the current ‘calm’—rising stock markets and house prices amidst ongoing depression for a substantial proportion of the population—shows that Obama administration policies resolved anything. The current circumstance of high stock prices (relative to cyclically adjusted corporate earnings) and tight (low) credit spreads has been a sign of impending turmoil in the past. For statistical reasons (non-stationary credit spreads), the too-big-to-fail subsidy of Wall Street should be viewed over boom/bust cycles, not at a cyclical low point for credit spreads. The claim that current conditions answer the question would look quite different if the ‘current condition’ were 2008–2009. And in fact, actual economic conditions for most Americans have continued to deteriorate as stock and housing prices, driven higher by Federal Reserve policies, have revived the fortunes of the very richest. Source: St. Louis Fed, Moody’s.


Graph (3): Mr. Obama’s supporters claim that were it not for his consequence-free bailouts of Wall Street, a second Great Depression would have occurred. Illustrated are the declines in real (inflation-adjusted) median household income and wealth both before and after Mr. Obama took office. For the bottom half of the population, both income and wealth continued to decline after Mr. Obama took office. This is not to claim that the full cause of the declines lies with Mr. Obama’s policies—former Democrat President Bill Clinton is substantially responsible for the catastrophic deregulation that led to the financial calamity. But the fact that Mr. Obama appointed the chief architects of this calamity, Timothy Geithner and Larry Summers, to head his administration’s bailout efforts goes far in explaining why only bankers and the very rich have benefited from his policies. Sources: St. Louis Fed and Russell Sage Foundation.

The issues around the ACA (Affordable Care Act) are just as pernicious but in many ways of more direct relevance to most people. A recent article in the New York Times made the point that there is a large ‘learning curve’ to effectively navigate Obamacare coverage. The article focused on Philadelphia but the issues raised are relevant nationwide. For the uninitiated, Pennsylvania is one of twenty-four states that declined to expand Medicaid coverage under Obamacare. Remember: Obamacare was sold as the ‘politically feasible’ alternative to (national) single-payer health care. With Medicaid being the only health care program available to the poor and very poor, the refusal of half of the states to expand the program suggests that settling for the ‘politically feasible’ choice was a very poor political calculation if expanding access to health care was the goal. The practical impact is that a substantial portion of poor and near poor across the country will see no increased access to health care under Obamacare.

The ‘learning curve’ at issue is the ins and outs of Obamacare insurance coverage— premiums, subsidies, co-payments, deductibles and out-of-pocket expenses, along with the intricacies of health care networks where health care providers may deal with hundreds of different insurers and know very little of what is covered by any specific insurance plan. In an effort to minimize costs insurers are creating ‘narrow networks’ that limit who provides covered health care— e.g. specific hospitals, doctors and labs. For example: if (non-emergency) surgery is needed the person having the surgery must make sure that the hospital where the surgery is to take place is ‘in network,’ that all doctors involved in any aspect of the surgery are in network, that all diagnostic tests are done through ‘in network’ labs and that all drugs prescribed are ‘approved’ within the network. Failure to know any and all of these details, and to make sure that everyone involved (hospital employees, doctors, nurses and administrators) both understands and acts on policy limitations, will result in bills for medical services that the people Obamacare was nominally designed to serve can’t afford to pay. Additionally, if you become sick and lose your income you either go on Medicaid, if you live in one of the twenty-six states that expanded Medicaid coverage, or you are on your own— no matter how many years you have been paying insurance premiums.

Here Wall Street and Obamacare start to come together. The complexification of health care, forcing people to know and to competently navigate every aspect of insurance contracts, medical consultation and health care provision or suffer adverse consequences, is related to Wall Street strategies of issuing mortgages that only those with a Ph.D. in math and a lot of time to waste on contractual minutiae can understand. The variable rate mortgages of the housing boom / bust were sold as ‘affordability products,’ as an accommodation to borrowers for their (the borrower’s) benefit. What they were was age-old predatory lending. The most complicated mortgages were issued to the least sophisticated borrowers. In the case of Obamacare, complexification works in the interests of insurers. The more difficult it is for the insured to know what costs they are ‘responsible’ for, the easier it is for insurers to force the costs of health care onto them. And even if one assumes honest motives, forcing people to devote their lives to the minutiae of health insurance policies is a uniquely American form of torture.

The delusional premise of Obamacare is that making health care less costly will make it affordable. This has been the Republican fantasy behind health insurance ‘vouchers’ for the last three decades— give everyone a three-hundred-dollar tax break to buy insurance and everyone will have health care. Obamacare is set up to give the bottom half of the country a choice between buying food, paying the rent and ‘buying’ health care. This ‘better than nothing’ approach dissuades people from getting health care until they have no other choice. Decades of experience from actual health care systems suggest that health care— keeping people healthy— is socially and economically less costly than treating people once they are sick. The second-order fantasy at work is that individuals can control health care costs by selecting health insurers that in turn select competent low cost health care providers. The amount of information needed to make the informed choice between policies that might actually accomplish this is beyond the ability of everyone likely to be touched by Obamacare. (Quick: what is the probability that an in network anesthesiologist will be available on any given day? Congratulations, you are one ten-thousandth of your way to making a decision.)

Democrat shill and mainstream economist Paul Krugman argues that California, where Medicaid was expanded under Obamacare, points the way toward a single payer health care system. The first problem with this is that Medicaid is a program of minimal health care provision for the very poor— any effort to conflate Medicaid with the functioning health care systems of other ‘developed’ countries is as pathetic as it is disingenuous. The second problem is that for all of the theorized political feasibility of Obamacare, half of the states have refused its most important element— Medicaid expansion— meaning that unless these state governments quickly change their minds Mr. Obama’s ‘pragmatic’ compromise looks a lot like what it is widely perceived to be: a cynical sell-out to the sick-care industry. Speculation that there will be a more propitious time to implement real health care reform (single payer) than the moment when Mr. Obama first took office and Democrats held both houses of Congress derives from the same failed ‘pragmatics’ that now leaves half of the states without Medicaid expansion. The third problem is that unless complexity is resolved people are going to despise Obamacare once they realize that they must devote their lives to insurance company minutiae to get the health care they are now being forced to pay health insurance premiums for. If Mr. Krugman, or any other Democrat Party shill, really wants to sell the idea that ‘Medicaid for all’ is the way forward, let him say so clearly so that we know what the Democrat plan really is.

The evidence for the success of Obamacare put forward to date— that a government mandate to make uninsured people buy health insurance has led uninsured people to buy health insurance— illustrates how empty public policy in the U.S. really is. The only relevant metric is whether or not we have a functioning health care system that serves all people, regardless of their ability to pay, or a very clear path in that direction. By this measure, with the most expensive health care system in the ‘developed’ world by a factor of two, substandard outcomes and half of the states rejecting Medicaid expansion, Obamacare has at best added complexity around the margins— it is an entire functioning health care system away from being successful. What will add to this dysfunction, and possibly to a social tipping point, will be the economic effects of the next inevitable financial crisis. As President, Barack Obama has been a continuation of the Carter-Reagan-Bush-Clinton-Bush line of bureaucrats of empire. They have all been successful in serving their constituencies. Maybe it’s ‘progress’ that a black man now has the job. But we aren’t his constituents. Wall Street and the sick-care corporations are. If Democrat apologists want to point to political successes, they should at least point to the right successes.

Rob Urie is an artist and political economist. His book Zen Economics is forthcoming.


http://www.counterpunch.org/2014/08/08/obama-and-the-revival-of-wall-street/

Sick of This Market-Driven World? You Should Be



These days, being a deviant from the prevailing wisdom is something to be proud of.


To be at peace with a troubled world: this is not a reasonable aim. It can be achieved only through a disavowal of what surrounds you. To be at peace with yourself within a troubled world: that, by contrast, is an honourable aspiration. This column is for those who feel at odds with life. It calls on you not to be ashamed.

I was prompted to write it by a remarkable book, just published in English, by a Belgian professor of psychoanalysis, Paul Verhaeghe. What About Me? The Struggle for Identity in a Market-Based Society is one of those books that, by making connections between apparently distinct phenomena, permits sudden new insights into what is happening to us and why.

We are social animals, Verhaeghe argues, and our identities are shaped by the norms and values we absorb from other people. Every society defines and shapes its own normality – and its own abnormality – according to dominant narratives, and seeks either to make people comply or to exclude them if they don’t.

Today the dominant narrative is that of market fundamentalism, widely known in Europe as neoliberalism. The story it tells is that the market can resolve almost all social, economic and political problems. The less the state regulates and taxes us, the better off we will be. Public services should be privatised, public spending should be cut, and business should be freed from social control. In countries such as the UK and the US, this story has shaped our norms and values for around 35 years: since Thatcher and Reagan came to power. It is rapidly colonising the rest of the world.

Verhaeghe points out that neoliberalism draws on the ancient Greek idea that our ethics are innate (and governed by a state of nature it calls the market) and on the Christian idea that humankind is inherently selfish and acquisitive. Rather than seeking to suppress these characteristics, neoliberalism celebrates them: it claims that unrestricted competition, driven by self-interest, leads to innovation and economic growth, enhancing the welfare of all.

At the heart of this story is the notion of merit. Untrammelled competition rewards people who have talent, work hard, and innovate. It breaks down hierarchies and creates a world of opportunity and mobility.

The reality is rather different. Even at the beginning of the process, when markets are first deregulated, we do not start with equal opportunities. Some people are a long way down the track before the starting gun is fired. This is how the Russian oligarchs managed to acquire such wealth when the Soviet Union broke up. They weren’t, on the whole, the most talented, hardworking or innovative people, but those with the fewest scruples, the most thugs, and the best contacts – often in the KGB.

Even when outcomes are based on talent and hard work, they don’t stay that way for long. Once the first generation of liberated entrepreneurs has made its money, the initial meritocracy is replaced by a new elite, which insulates its children from competition by inheritance and the best education money can buy. Where market fundamentalism has been most fiercely applied – in countries like the US and UK – social mobility has greatly declined.

If neoliberalism was anything other than a self-serving con, whose gurus and thinktanks were financed from the beginning by some of the world’s richest people (the US multimillionaires Coors, Olin, Scaife, Pew and others), its apostles would have demanded, as a precondition for a society based on merit, that no one should start life with the unfair advantage of inherited wealth or economically determined education. But they never believed in their own doctrine. Enterprise, as a result, quickly gave way to rent.

All this is ignored, and success or failure in the market economy are ascribed solely to the efforts of the individual. The rich are the new righteous; the poor are the new deviants, who have failed both economically and morally and are now classified as social parasites.

The market was meant to emancipate us, offering autonomy and freedom. Instead it has delivered atomisation and loneliness.

The workplace has been overwhelmed by a mad, Kafkaesque infrastructure of assessments, monitoring, measuring, surveillance and audits, centrally directed and rigidly planned, whose purpose is to reward the winners and punish the losers. It destroys autonomy, enterprise, innovation and loyalty, and breeds frustration, envy and fear. Through a magnificent paradox, it has led to the revival of a grand old Soviet tradition known in Russian as tufta. It means falsification of statistics to meet the diktats of unaccountable power.

The same forces afflict those who can’t find work. They must now contend, alongside the other humiliations of unemployment, with a whole new level of snooping and monitoring. All this, Verhaeghe points out, is fundamental to the neoliberal model, which everywhere insists on comparison, evaluation and quantification. We find ourselves technically free but powerless. Whether in work or out of work, we must live by the same rules or perish. All the major political parties promote them, so we have no political power either. In the name of autonomy and freedom we have ended up controlled by a grinding, faceless bureaucracy.

These shifts have been accompanied, Verhaeghe writes, by a spectacular rise in certain psychiatric conditions: self-harm, eating disorders, depression and personality disorders.

Of the personality disorders, the most common are performance anxiety and social phobia: both of which reflect a fear of other people, who are perceived as both evaluators and competitors – the only roles for society that market fundamentalism admits. Depression and loneliness plague us.

The infantilising diktats of the workplace destroy our self-respect. Those who end up at the bottom of the pile are assailed by guilt and shame. The self-attribution fallacy cuts both ways: just as we congratulate ourselves for our success, we blame ourselves for our failure, even if we have little to do with it.

So, if you don’t fit in, if you feel at odds with the world, if your identity is troubled and frayed, if you feel lost and ashamed – it could be because you have retained the human values you were supposed to have discarded. You are a deviant. Be proud.

George Monbiot is the author of Heat: How to Stop the Planet from Burning. Read more of his writings at Monbiot.com. This article originally appeared in the Guardian.

http://www.alternet.org/economy/sick-market-driven-world-you-should-be?paging=off&current_page=1#bookmark

Walter Isaacson: “Innovation” doesn’t mean anything anymore

The man who brought America inside the minds of Einstein, Franklin and Jobs takes issue with modern-day tech hype


If anybody in America understands genius, it’s Walter Isaacson.

The bestselling biographer has chronicled the lives of everyone from Benjamin Franklin and Albert Einstein to Henry Kissinger and (most recently) Steve Jobs. In the process, he has garnered a reputation as a writer deeply attuned to the idiosyncratic — and sometimes megalomaniacal — personalities and predilections of singularly brilliant men. But genius alone, as he would probably be the first to point out, actually isn’t enough to change the world.

We live in a time when technology companies — from Google and Apple to the burgeoning start-up community that’s taken Silicon Valley by storm — have staked a place at the center of American culture. And the idea of innovation, how an idea translates from mind stuff into tangible reality, has consequently become shrouded in a mythology about genius and grit — brainiacs with a golden idea holed up in a dingy garage, working in obscurity before taking the world by storm — that is emotionally appealing but short on nuance. The truth of the matter, as Isaacson has pointed out, is that what makes for a genuine, world-changing innovation is much more complicated than a towering IQ. In reality, execution is everything.

Isaacson’s new book, “The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution,” slated for release in October, explores how disruptive change really comes to fruition. Salon spoke with him earlier this summer about the nature of innovation, and how it’s often misunderstood. This interview has been edited for length and clarity.

First let’s start with something big: What do you think is the greatest issue facing our time?

Right now, I think it’s inequality of opportunity. Ever since the days of Benjamin Franklin the basic American philosophy – the basic American creed – has been that if you work hard and play by the rules you can be secure. And we’re losing that these days because of unequal education, unequal opportunities and inequalities in wealth.

And I would say that the most important element of that is creating better educational opportunities for all. I think that it used to be … well. I think that every kid should have a decent opportunity for a great education and we don’t have that at the moment.



I wanted to talk to you about how you define innovation because you have written about so many innovators. And the terms “innovation” and “innovator” are sort of the terms of our time. Everything is described in this way.

I think that the word “innovation” has become a buzzword and it’s been drained of much of its meaning because we overuse it.

For the past 12 years I’ve been working on a book about the people who actually invented the computer and the Internet. I put it aside to work on the Steve Jobs biography, but I went back to it, because I wanted to show how real innovators actually get something done.

So instead of trying to write a philosophy or simple rules of innovation, I wanted to do it biographically, to show how people you may never have heard of, who invented the computer and Internet, actually came up with their ideas and executed them, because I think when we talk about innovation in the abstract, it loses its meaning.

Innovation comes from collaboration, it comes from teamwork, and it comes from being able to take visionary ideas and actually execute them with good engineering. And there’s no simple buzzword definition of innovation; I think it’s useful, as somebody who loves history, to just focus on real people and how they invented things, and that includes the computer, the Internet, but also the transistor, the microchip, search engines, and the World Wide Web. And there were real people who worked in teams and were able to execute on their ideas and I wanted to tell their stories, based on a dozen years of reporting what they did, instead of some abstract theory of innovation.

I began working on this book 12 years ago, mainly focused on the Internet, but after writing about Steve Jobs and interviewing people who’d been involved with the personal computer, I decided to make it a book about the intersection of digital networks and the personal computer, and how people in that field actually executed on their ideas. I’m not interested in how-to manuals about the philosophy of innovation. I’m interested in real people and how they actually succeeded or failed.

Like Ada Lovelace, who I’ve read that you feature prominently in your new book.

Yeah, Ada was the person who connected art to technology in the 1840s. Her father was the poet Lord Byron, her close friend was Charles Babbage, who invented the analytical engine, and she was able to understand how you could program a mechanical calculator to do more than just numbers. That it could weave patterns just like the punch cards that helped mechanical looms weave fabric, and that’s the first example in my next book of how real people went about connecting the arts and technology to new innovative things.

I start with her, but it goes all the way through the history of people we’ve never paid enough attention to because the people who invented the computer and the Internet were not lone inventors like an Edison or a Bell, in their labs saying, “Eureka!” They were teams of people who worked collaboratively, and so I think sometimes we underestimate … or sometimes we don’t fully appreciate the importance of collaborative creativity. So my book is not a theoretical book, but it’s just a history of the collaborations and teamwork that led to the computer, the Internet, the transistor, the microchip, Wikipedia, Google and other innovations.

In writing this book over the past 12 years, and in your other books on Albert Einstein and Steve Jobs and Ben Franklin, have you noticed any patterns or any similarities? Anything you start to pick up and think, “Oh well that’s very similar between these two, maybe that makes a good innovator”? You’ve mentioned the teams and the collaboration …

Yeah, there’s not one formula, which is fortunate for those of us that write biographies, otherwise you wouldn’t need a lot of biographies. Albert Einstein was much more of a loner, whereas Ben Franklin’s genius was bringing people together into teams. Steve Jobs’ genius was applying creativity and beauty to technology. But the one thing they had in common is they were all imaginative. They all questioned the conventional way of doing things. And as Einstein once said, imagination is more important than knowledge. And that’s sort of been a theme of all of my books.


http://www.salon.com/2014/08/05/walter_isaacson_innovation_doesnt_mean_anything_anymore/?source=newsletter

Nearly one quarter of US children in poverty

http://epmgaa.media.lionheartdms.com/img/croppedphotos/2013/09/17/child_poverty.jpg

By Andre Damon
23 July 2014

Nearly one in four children in the United States lives in a family below the federal poverty line, according to figures presented in a new report by the Annie E. Casey Foundation.

A total of 16.3 million children live in poverty, and 45 percent of children in the US live in households whose incomes fall below 200 percent of the federal poverty line.

The annual report, titled the Kids Count Data Book, compiles data on children’s economic well-being, education, health, and family support. It concludes that “inequities among children remain deep and stubbornly persistent.”

The report is an indictment of the state of American society nearly six years after the onset of the financial crisis in 2008. While the Obama administration and the media have proclaimed an economic “recovery,” conditions of life for the vast majority of the population continue to deteriorate.

The report notes that the percentage of children in poverty hit 23 percent in 2012, up sharply from 16 percent in 2000. Some states are much worse. For almost the entire American South, the share of children in poverty is higher than 25 percent.

These conditions are the product of a ruthless class policy pursued at all levels of government. While trillions of dollars have been made available to Wall Street, sending both the stock markets and corporate profits to record highs, economic growth has stagnated, social programs have been slashed, and public services decimated, while prices of many basic items are on the rise. Jobs that have been “created” are overwhelmingly part-time or low-wage.

“We’ve yet to see the recovery from the economic recession,” said Laura Speer, associate director for policy reform and advocacy at the Annie E. Casey Foundation, who helped produce the report. “The child poverty rate is connected to parents’ employment and how much they are getting paid,” added Ms. Speer in a telephone interview Tuesday.

“The jobs that are being created in this economy, including temporary and low-wage jobs, are not good enough to keep children out of poverty,” she added.

The Kids Count report notes, “Declining economic opportunity for parents without a college degree in the context of growing inequality has meant that children’s life chances are increasingly constrained by the socioeconomic status of their parents.” The percentage of children who live in high-poverty communities has likewise increased significantly, with 13 percent of children growing up in communities where more than 30 percent of residents are poor, up from 9 percent in 2000.

Speer added that, given the significant run-up in home prices over the previous two decades, “the housing cost burden has gotten worse.” She noted that the share of children who live in households that spend more than one third of their annual income on housing has hit 38 percent, up from 28 percent in 1990. In states such as California, these figures are significantly higher.

“In many cases families are living doubled up and sleeping on couches to afford very expensive places like New York City,” she added. “Paying such a large share of your income for rent means that parents have to decide between whether or not to pay the rent or to pay the utility bills. It’s not a matter of making choices over things that are luxuries, it’s choosing between necessities.”

The report concludes, “As both poverty and wealth have become more concentrated residentially, evidence suggests that school districts and individual schools are becoming increasingly segregated by socioeconomic status.”

In most of the United States, K-12 education is funded through property taxes, and there are significant differences in education funding based on local income levels. “Kids who grow up in low-income neighborhoods have much less access to education: that’s only been exacerbated over the last 25 years,” Speer said.

The Kids Count survey follows the publication in April of Feeding America’s annual report, which showed that one in five children live in households that do not regularly get enough to eat. The percentage of households that are “food insecure” rose from 11.1 percent in 2007 to 16.0 percent in 2012. Sixteen million children, or 21.6 percent, do not get enough to eat. The rate of food insecurity in the United States is nearly twice that of the European Union.

According to the US government’s supplemental poverty measure, 16.1 percent of the US population—nearly 50 million people—is in poverty, up from 12.2 percent of the population in 2000.

The Kids Count report notes that the ability of single mothers to get a job is particularly sensitive to the state of the economy, and that the employment rate of single mothers with children under 6 years old has fallen from 69 percent in 2000 to 60 percent ten years later. This has taken place even as anti-poverty measures such as Temporary Assistance for Needy Families (TANF) have been made conditional on parents finding work.

The report noted that enrollment in the federal Head Start program, which serves 3- and 4-year-olds, dropped off when the “recession decimated state budgets and halted progress.” It added that cutbacks to federal and state anti-poverty programs, as well as health programs such as Medicare and Medicaid, are contributing to the growth of poverty and inequality.

With the “sequester” budget cuts signed by the Obama administration in early 2013, most federal anti-poverty programs are being slashed by five percent each year for a decade. “Programs like Head Start, LIHEAP [Low Income Home Energy Assistance Program], and other federal programs are really a lifeline in a lot of families,” Speer said.

Since the implementation of the sequester cuts, Congress and the Obama administration have slashed food stamp spending on two separate occasions and put an end to federal extended jobless benefits for more than three million long-term unemployed people and their families. These measures can be expected to throw hundreds of thousands more children into poverty.

The rise of data and the death of politics

Tech pioneers in the US are advocating a new data-based approach to governance – ‘algorithmic regulation’. But if technology provides the answers to society’s problems, what happens to governments?

US president Barack Obama with Facebook founder Mark Zuckerberg

Government by social network? US president Barack Obama with Facebook founder Mark Zuckerberg. Photograph: Mandel Ngan/AFP/Getty Images

On 24 August 1965 Gloria Placente, a 34-year-old resident of Queens, New York, was driving to Orchard Beach in the Bronx. Clad in shorts and sunglasses, the housewife was looking forward to quiet time at the beach. But the moment she crossed the Willis Avenue bridge in her Chevrolet Corvair, Placente was surrounded by a dozen patrolmen. There were also 125 reporters, eager to witness the launch of New York police department’s Operation Corral – an acronym for Computer Oriented Retrieval of Auto Larcenists.

Fifteen months earlier, Placente had driven through a red light and neglected to answer the summons, an offence that Corral was going to punish with a heavy dose of techno-Kafkaesque. It worked as follows: a police car stationed at one end of the bridge radioed the licence plates of oncoming cars to a teletypist miles away, who fed them to a Univac 490 computer, an expensive $500,000 toy ($3.5m in today’s dollars) on loan from the Sperry Rand Corporation. The computer checked the numbers against a database of 110,000 cars that were either stolen or belonged to known offenders. In case of a match the teletypist would alert a second patrol car at the bridge’s other exit. It took, on average, just seven seconds.

Compared with the impressive police gear of today – automatic number plate recognition, CCTV cameras, GPS trackers – Operation Corral looks quaint. And the possibilities for control will only expand. European officials have considered requiring all cars entering the European market to feature a built-in mechanism that allows the police to stop vehicles remotely. Speaking earlier this year, Jim Farley, a senior Ford executive, acknowledged that “we know everyone who breaks the law, we know when you’re doing it. We have GPS in your car, so we know what you’re doing. By the way, we don’t supply that data to anyone.” That last bit didn’t sound very reassuring, and Farley later retracted his remarks.

As both cars and roads get “smart,” they promise nearly perfect, real-time law enforcement. Instead of waiting for drivers to break the law, authorities can simply prevent the crime. Thus, a 50-mile stretch of the A14 between Felixstowe and Rugby is to be equipped with numerous sensors that would monitor traffic by sending signals to and from mobile phones in moving vehicles. The telecoms watchdog Ofcom envisions that such smart roads connected to a centrally controlled traffic system could automatically impose variable speed limits to smooth the flow of traffic but also direct the cars “along diverted routes to avoid the congestion and even [manage] their speed”.

Other gadgets – from smartphones to smart glasses – promise even more security and safety. In April, Apple patented technology that deploys sensors inside the smartphone to analyse if the car is moving and if the person using the phone is driving; if both conditions are met, it simply blocks the phone’s texting feature. Intel and Ford are working on Project Mobil – a face recognition system that, should it fail to recognise the face of the driver, would not only prevent the car being started but also send the picture to the car’s owner (bad news for teenagers).

The car is emblematic of transformations in many other domains, from smart environments for “ambient assisted living” where carpets and walls detect that someone has fallen, to various masterplans for the smart city, where municipal services dispatch resources only to those areas that need them. Thanks to sensors and internet connectivity, the most banal everyday objects have acquired tremendous power to regulate behaviour. Even public toilets are ripe for sensor-based optimisation: the Safeguard Germ Alarm, a smart soap dispenser developed by Procter & Gamble and used in some public WCs in the Philippines, has sensors monitoring the doors of each stall. Once you leave the stall, the alarm starts ringing – and can only be stopped by a push of the soap-dispensing button.

In this context, Google’s latest plan to push its Android operating system on to smart watches, smart cars, smart thermostats and, one suspects, smart everything, looks rather ominous. In the near future, Google will be the middleman standing between you and your fridge, you and your car, you and your rubbish bin, allowing the National Security Agency to satisfy its data addiction in bulk and via a single window.

This “smartification” of everyday life follows a familiar pattern: there’s primary data – a list of what’s in your smart fridge and your bin – and metadata – a log of how often you open either of these things or when they communicate with one another. Both produce interesting insights: cue smart mattresses – one recent model promises to track respiration and heart rates and how much you move during the night – and smart utensils that provide nutritional advice.

In addition to making our lives more efficient, this smart world also presents us with an exciting political choice. If so much of our everyday behaviour is already captured, analysed and nudged, why stick with unempirical approaches to regulation? Why rely on laws when one has sensors and feedback mechanisms? If policy interventions are to be – to use the buzzwords of the day – “evidence-based” and “results-oriented,” technology is here to help.

This new type of governance has a name: algorithmic regulation. In as much as Silicon Valley has a political programme, this is it. Tim O’Reilly, an influential technology publisher, venture capitalist and ideas man (he is to blame for popularising the term “web 2.0”) has been its most enthusiastic promoter. In a recent essay that lays out his reasoning, O’Reilly makes an intriguing case for the virtues of algorithmic regulation – a case that deserves close scrutiny both for what it promises policymakers and the simplistic assumptions it makes about politics, democracy and power.

To see algorithmic regulation at work, look no further than the spam filter in your email. Instead of confining itself to a narrow definition of spam, the email filter has its users teach it. Even Google can’t write rules to cover all the ingenious innovations of professional spammers. What it can do, though, is teach the system what makes a good rule and spot when it’s time to find another rule for finding a good rule – and so on. An algorithm can do this, but it’s the constant real-time feedback from its users that allows the system to counter threats never envisioned by its designers. And it’s not just spam: your bank uses similar methods to spot credit-card fraud.
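
The feedback loop described above can be sketched in a few lines of Python. This is a toy online classifier, not Google’s actual filter: every user correction immediately updates the word counts, so the “rules” are relearned from feedback rather than written in advance.

```python
from collections import defaultdict
import math

class FeedbackSpamFilter:
    """Toy online spam classifier: user corrections are the feedback
    signal that continuously reshapes the filter's behaviour."""

    def __init__(self):
        self.counts = {"spam": defaultdict(int), "ham": defaultdict(int)}
        self.totals = {"spam": 0, "ham": 0}

    def train(self, text, label):
        # A user marking a message as spam/not-spam updates the counts.
        for word in text.lower().split():
            self.counts[label][word] += 1
            self.totals[label] += 1

    def score(self, text):
        # Log-odds that the message is spam, with add-one smoothing;
        # positive means "leans spam".
        logodds = 0.0
        for word in text.lower().split():
            p_spam = (self.counts["spam"][word] + 1) / (self.totals["spam"] + 2)
            p_ham = (self.counts["ham"][word] + 1) / (self.totals["ham"] + 2)
            logodds += math.log(p_spam / p_ham)
        return logodds

f = FeedbackSpamFilter()
f.train("cheap pills buy now", "spam")
f.train("meeting agenda attached", "ham")
print(f.score("buy cheap pills") > 0)     # True
print(f.score("agenda for meeting") > 0)  # False
```

The point of the sketch is the loop, not the maths: each new correction changes what the system will do next, which is precisely what fixed, hand-written rules cannot do.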

In his essay, O’Reilly draws broader philosophical lessons from such technologies, arguing that they work because they rely on “a deep understanding of the desired outcome” (spam is bad!) and periodically check if the algorithms are actually working as expected (are too many legitimate emails ending up marked as spam?).

O’Reilly presents such technologies as novel and unique – we are living through a digital revolution after all – but the principle behind “algorithmic regulation” would be familiar to the founders of cybernetics – a discipline that, even in its name (it means “the science of governance”) hints at its great regulatory ambitions. This principle, which allows the system to maintain its stability by constantly learning and adapting itself to the changing circumstances, is what the British psychiatrist Ross Ashby, one of the founding fathers of cybernetics, called “ultrastability”.

To illustrate it, Ashby designed the homeostat. This clever device consisted of four interconnected RAF bomb control units – mysterious looking black boxes with lots of knobs and switches – that were sensitive to voltage fluctuations. If one unit stopped working properly – say, because of an unexpected external disturbance – the other three would rewire and regroup themselves, compensating for its malfunction and keeping the system’s overall output stable.

Ashby’s homeostat achieved “ultrastability” by always monitoring its internal state and cleverly redeploying its spare resources.

Like the spam filter, it didn’t have to specify all the possible disturbances – only the conditions for how and when it must be updated and redesigned. This is no trivial departure from how conventional technical systems, with their rigid if-then rules, operate: suddenly, there’s no need to develop procedures for governing every contingency, for – or so one hopes – algorithms and real-time, immediate feedback can do a better job than inflexible rules out of touch with reality.
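
Ashby’s principle is easy to simulate. The sketch below is a loose caricature of the homeostat, not a model of the original hardware: four coupled units update one another, and whenever a unit’s “essential variable” leaves its safe range, that unit’s connection weights are rewired at random – the random rewiring, not any pre-specified rule, is what restores stability.

```python
import random

def homeostat(steps=200, seed=0):
    """Ashby-style ultrastability in miniature: when a unit's state
    leaves its safe range, its incoming weights are rewired at random
    until the system as a whole settles."""
    rng = random.Random(seed)
    n = 4
    # Random coupling weights between the four units.
    w = [[rng.uniform(-1, 1) for _ in range(n)] for _ in range(n)]
    x = [rng.uniform(-0.5, 0.5) for _ in range(n)]
    rewirings = 0
    for _ in range(steps):
        x = [sum(w[i][j] * x[j] for j in range(n)) for i in range(n)]
        for i in range(n):
            if abs(x[i]) > 1.0:  # essential variable out of bounds
                # Random rewiring: no pre-planned response, just search.
                w[i] = [rng.uniform(-1, 1) for _ in range(n)]
                x[i] = max(-1.0, min(1.0, x[i]))
                rewirings += 1
    return x, rewirings
```

Nothing in the code anticipates any particular disturbance; the system only knows when its variables are out of bounds and that it must then reconfigure itself – which is the whole of Ashby’s trick.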

Algorithmic regulation could certainly make the administration of existing laws more efficient. If it can fight credit-card fraud, why not tax fraud? Italian bureaucrats have experimented with the redditometro, or income meter, a tool for comparing people’s spending patterns – recorded thanks to an arcane Italian law – with their declared income, so that authorities know when you spend more than you earn. Spain has expressed interest in a similar tool.
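
In spirit, the redditometro is a very simple check. The sketch below is purely illustrative – the field names, tolerance threshold and data are invented, not the real Italian system:

```python
def redditometro_flags(records, tolerance=0.2):
    """Hypothetical income-meter check: flag any taxpayer whose recorded
    spending exceeds declared income by more than `tolerance`.
    All names and the threshold are illustrative assumptions."""
    flagged = []
    for r in records:
        if r["spending"] > r["declared_income"] * (1 + tolerance):
            flagged.append(r["taxpayer_id"])
    return flagged

records = [
    {"taxpayer_id": "A", "declared_income": 30000, "spending": 29000},
    {"taxpayer_id": "B", "declared_income": 20000, "spending": 45000},
]
print(redditometro_flags(records))  # ['B']
```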

Such systems, however, are toothless against the real culprits of tax evasion – the super-rich families who profit from various offshoring schemes or simply write outrageous tax exemptions into the law. Algorithmic regulation is perfect for enforcing the austerity agenda while leaving those responsible for the fiscal crisis off the hook. To understand whether such systems are working as expected, we need to modify O’Reilly’s question: for whom are they working? If it’s just the tax-evading plutocrats, the global financial institutions interested in balanced national budgets and the companies developing income-tracking software, then it’s hardly a democratic success.

With his belief that algorithmic regulation is based on “a deep understanding of the desired outcome”, O’Reilly cunningly disconnects the means of doing politics from its ends. But the how of politics is as important as the what of politics – in fact, the former often shapes the latter. Everybody agrees that education, health, and security are all “desired outcomes”, but how do we achieve them? In the past, when we faced the stark political choice of delivering them through the market or the state, the lines of the ideological debate were clear. Today, when the presumed choice is between the digital and the analog or between the dynamic feedback and the static law, that ideological clarity is gone – as if the very choice of how to achieve those “desired outcomes” was apolitical and didn’t force us to choose between different and often incompatible visions of communal living.

By assuming that the utopian world of infinite feedback loops is so efficient that it transcends politics, the proponents of algorithmic regulation fall into the same trap as the technocrats of the past. Yes, these systems are terrifyingly efficient – in the same way that Singapore is terrifyingly efficient (O’Reilly, unsurprisingly, praises Singapore for its embrace of algorithmic regulation). And while Singapore’s leaders might believe that they, too, have transcended politics, it doesn’t mean that their regime cannot be assessed outside the linguistic swamp of efficiency and innovation – by using political, not economic benchmarks.

As Silicon Valley keeps corrupting our language with its endless glorification of disruption and efficiency – concepts at odds with the vocabulary of democracy – our ability to question the “how” of politics is weakened. Silicon Valley’s default answer to the how of politics is what I call solutionism: problems are to be dealt with via apps, sensors, and feedback loops – all provided by startups. Earlier this year Google’s Eric Schmidt even promised that startups would provide the solution to the problem of economic inequality: the latter, it seems, can also be “disrupted”. And where the innovators and the disruptors lead, the bureaucrats follow.

The intelligence services embraced solutionism before other government agencies. Thus, they reduced the topic of terrorism from a subject that had some connection to history and foreign policy to an informational problem of identifying emerging terrorist threats via constant surveillance. They urged citizens to accept that instability is part of the game, that its root causes are neither traceable nor reparable, that the threat can only be pre-empted by out-innovating and out-surveilling the enemy with better communications.

Speaking in Athens last November, the Italian philosopher Giorgio Agamben discussed an epochal transformation in the idea of government, “whereby the traditional hierarchical relation between causes and effects is inverted, so that, instead of governing the causes – a difficult and expensive undertaking – governments simply try to govern the effects”.

For Agamben, this shift is emblematic of modernity. It also explains why the liberalisation of the economy can co-exist with the growing proliferation of control – by means of soap dispensers and remotely managed cars – into everyday life. “If government aims for the effects and not the causes, it will be obliged to extend and multiply control. Causes demand to be known, while effects can only be checked and controlled.” Algorithmic regulation is an enactment of this political programme in technological form.

The true politics of algorithmic regulation become visible once its logic is applied to the social nets of the welfare state. There are no calls to dismantle them, but citizens are nonetheless encouraged to take responsibility for their own health. Consider how Fred Wilson, an influential US venture capitalist, frames the subject. “Health… is the opposite side of healthcare,” he said at a conference in Paris last December. “It’s what keeps you out of the healthcare system in the first place.” Thus, we are invited to start using self-tracking apps and data-sharing platforms and monitor our vital indicators, symptoms and discrepancies on our own.

This goes nicely with recent policy proposals to save troubled public services by encouraging healthier lifestyles. Consider a 2013 report by Westminster council and the Local Government Information Unit, a thinktank, calling for the linking of housing and council benefits to claimants’ visits to the gym – with the help of smartcards. They might not be needed: many smartphones are already tracking how many steps we take every day (Google Now, the company’s virtual assistant, keeps score of such data automatically and periodically presents it to users, nudging them to walk more).

The numerous possibilities that tracking devices offer to health and insurance industries are not lost on O’Reilly. “You know the way that advertising turned out to be the native business model for the internet?” he wondered at a recent conference. “I think that insurance is going to be the native business model for the internet of things.” Things do seem to be heading that way: in June, Microsoft struck a deal with American Family Insurance, the eighth-largest home insurer in the US, in which both companies will fund startups that want to put sensors into smart homes and smart cars for the purposes of “proactive protection”.

An insurance company would gladly subsidise the costs of installing yet another sensor in your house – as long as it can automatically alert the fire department or make front porch lights flash in case your smoke detector goes off. For now, accepting such tracking systems is framed as an extra benefit that can save us some money. But when do we reach a point where not using them is seen as a deviation – or, worse, an act of concealment – that ought to be punished with higher premiums?

Or consider a May 2014 report from 2020health, another thinktank, proposing to extend tax rebates to Britons who give up smoking, stay slim or drink less. “We propose ‘payment by results’, a financial reward for people who become active partners in their health, whereby if you, for example, keep your blood sugar levels down, quit smoking, keep weight off, [or] take on more self-care, there will be a tax rebate or an end-of-year bonus,” they state. Smart gadgets are the natural allies of such schemes: they document the results and can even help achieve them – by constantly nagging us to do what’s expected.

The unstated assumption of most such reports is that the unhealthy are not only a burden to society but that they deserve to be punished (fiscally for now) for failing to be responsible. For what else could possibly explain their health problems but their personal failings? It’s certainly not the power of food companies or class-based differences or various political and economic injustices. One can wear a dozen powerful sensors, own a smart mattress and even do a close daily reading of one’s poop – as some self-tracking aficionados are wont to do – but those injustices would still be nowhere to be seen, for they are not the kind of stuff that can be measured with a sensor. The devil doesn’t wear data. Social injustices are much harder to track than the everyday lives of the individuals whose lives they affect.

In shifting the focus of regulation from reining in institutional and corporate malfeasance to perpetual electronic guidance of individuals, algorithmic regulation offers us a good-old technocratic utopia of politics without politics. Disagreement and conflict, under this model, are seen as unfortunate byproducts of the analog era – to be solved through data collection – and not as inevitable results of economic or ideological conflicts.

However, a politics without politics does not mean a politics without control or administration. As O’Reilly writes in his essay: “New technologies make it possible to reduce the amount of regulation while actually increasing the amount of oversight and production of desirable outcomes.” Thus, it’s a mistake to think that Silicon Valley wants to rid us of government institutions. Its dream state is not the small government of libertarians – a small state, after all, needs neither fancy gadgets nor massive servers to process the data – but the data-obsessed and data-obese state of behavioural economists.

The nudging state is enamoured of feedback technology, for its key founding principle is that while we behave irrationally, our irrationality can be corrected – if only the environment acts upon us, nudging us towards the right option. Unsurprisingly, one of the three lonely references at the end of O’Reilly’s essay is to a 2012 speech entitled “Regulation: Looking Backward, Looking Forward” by Cass Sunstein, the prominent American legal scholar who is the chief theorist of the nudging state.

And while the nudgers have already captured the state by making behavioural psychology the favourite idiom of government bureaucracy – Daniel Kahneman is in, Machiavelli is out – the algorithmic regulation lobby advances in more clandestine ways. They create innocuous non-profit organisations like Code for America which then co-opt the state – under the guise of encouraging talented hackers to tackle civic problems.

Such initiatives aim to reprogramme the state and make it feedback-friendly, crowding out other means of doing politics. For all those tracking apps, algorithms and sensors to work, databases need interoperability – which is what such pseudo-humanitarian organisations, with their ardent belief in open data, demand. And when the government is too slow to move at Silicon Valley’s speed, they simply move inside the government. Thus, Jennifer Pahlka, the founder of Code for America and a protege of O’Reilly, became the deputy chief technology officer of the US government – while pursuing a one-year “innovation fellowship” from the White House.

Cash-strapped governments welcome such colonisation by technologists – especially if it helps to identify and clean up datasets that can be profitably sold to companies who need such data for advertising purposes. Recent clashes over the sale of student and health data in the UK are just a precursor of battles to come: after all state assets have been privatised, data is the next target. For O’Reilly, open data is “a key enabler of the measurement revolution”.

This “measurement revolution” seeks to quantify the efficiency of various social programmes, as if the rationale behind the social nets that some of them provide was to achieve perfection of delivery. The actual rationale, of course, was to enable a fulfilling life by suppressing certain anxieties, so that citizens can pursue their life projects relatively undisturbed. This vision did spawn a vast bureaucratic apparatus and the critics of the welfare state from the left – most prominently Michel Foucault – were right to question its disciplining inclinations. Nonetheless, neither perfection nor efficiency were the “desired outcome” of this system. Thus, to compare the welfare state with the algorithmic state on those grounds is misleading.

But we can compare their respective visions for human fulfilment – and the role they assign to markets and the state. Silicon Valley’s offer is clear: thanks to ubiquitous feedback loops, we can all become entrepreneurs and take care of our own affairs! As Brian Chesky, the chief executive of Airbnb, told the Atlantic last year, “What happens when everybody is a brand? When everybody has a reputation? Every person can become an entrepreneur.”

Under this vision, we will all code (for America!) in the morning, drive Uber cars in the afternoon, and rent out our kitchens as restaurants – courtesy of Airbnb – in the evening. As O’Reilly writes of Uber and similar companies, “these services ask every passenger to rate their driver (and drivers to rate their passenger). Drivers who provide poor service are eliminated. Reputation does a better job of ensuring a superb customer experience than any amount of government regulation.”
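
The reputation mechanism O’Reilly praises reduces, in the end, to a one-line filter. The sketch below is a deliberately crude illustration; the 4.6 cutoff and the data are invented, not any platform’s real policy:

```python
def surviving_drivers(ratings, cutoff=4.6):
    """Toy version of reputation-as-regulation: drivers whose average
    rating falls below a cutoff are simply removed from the platform.
    The cutoff value is an illustrative assumption."""
    averages = {driver: sum(scores) / len(scores)
                for driver, scores in ratings.items()}
    return {driver for driver, avg in averages.items() if avg >= cutoff}

ratings = {
    "alice": [5, 5, 4],  # average ~4.67: stays on the platform
    "bob": [5, 4, 3],    # average 4.0: "eliminated"
}
print(surviving_drivers(ratings))  # {'alice'}
```

What the one-liner hides, of course, is everything contested about it: who sets the cutoff, what a rating actually measures, and what recourse the eliminated have.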

The state behind the “sharing economy” does not wither away; it might be needed to ensure that the reputation accumulated on Uber, Airbnb and other platforms of the “sharing economy” is fully liquid and transferable, creating a world where our every social interaction is recorded and assessed, erasing whatever differences exist between social domains. Someone, somewhere will eventually rate you as a passenger, a house guest, a student, a patient, a customer. Whether this ranking infrastructure will be decentralised, provided by a giant like Google or rest with the state is not yet clear but the overarching objective is: to make reputation into a feedback-friendly social net that could protect the truly responsible citizens from the vicissitudes of deregulation.

Admiring the reputation models of Uber and Airbnb, O’Reilly wants governments to be “adopting them where there are no demonstrable ill effects”. But what counts as an “ill effect” and how to demonstrate it is a key question that belongs to the how of politics that algorithmic regulation wants to suppress. It’s easy to demonstrate “ill effects” if the goal of regulation is efficiency but what if it is something else? Surely, there are some benefits – fewer visits to the psychoanalyst, perhaps – in not having your every social interaction ranked?

The imperative to evaluate and demonstrate “results” and “effects” already presupposes that the goal of policy is the optimisation of efficiency. However, as long as democracy is irreducible to a formula, its composite values will always lose this battle: they are much harder to quantify.

For Silicon Valley, though, the reputation-obsessed algorithmic state of the sharing economy is the new welfare state. If you are honest and hardworking, your online reputation would reflect this, producing a highly personalised social net. It is “ultrastable” in Ashby’s sense: while the welfare state assumes the existence of specific social evils it tries to fight, the algorithmic state makes no such assumptions. The future threats can remain fully unknowable and fully addressable – on the individual level.

Silicon Valley, of course, is not alone in touting such ultrastable individual solutions. Nassim Taleb, in his best-selling 2012 book Antifragile, makes a similar, if more philosophical, plea for maximising our individual resourcefulness and resilience: don’t get one job but many, don’t take on debt, count on your own expertise. It’s all about resilience, risk-taking and, as Taleb puts it, “having skin in the game”. As Julian Reid and Brad Evans write in their new book, Resilient Life: The Art of Living Dangerously, this growing cult of resilience masks a tacit acknowledgement that no collective project could even aspire to tame the proliferating threats to human existence – we can only hope to equip ourselves to tackle them individually. “When policy-makers engage in the discourse of resilience,” write Reid and Evans, “they do so in terms which aim explicitly at preventing humans from conceiving of danger as a phenomenon from which they might seek freedom and even, in contrast, as that to which they must now expose themselves.”

What, then, is the progressive alternative? “The enemy of my enemy is my friend” doesn’t work here: just because Silicon Valley is attacking the welfare state doesn’t mean that progressives should defend it to the very last bullet (or tweet). First, even leftist governments have limited space for fiscal manoeuvres, as the kind of discretionary spending required to modernise the welfare state would never be approved by the global financial markets. And it’s the ratings agencies and bond markets – not the voters – who are in charge today.

Second, the leftist critique of the welfare state has become only more relevant today when the exact borderlines between welfare and security are so blurry. When Google’s Android powers so much of our everyday life, the government’s temptation to govern us through remotely controlled cars and alarm-operated soap dispensers will be all too great. This will expand government’s hold over areas of life previously free from regulation.

With so much data, the government’s favourite argument in fighting terror – if only the citizens knew as much as we do, they too would impose all these legal exceptions – easily extends to other domains, from health to climate change. Consider a recent academic paper that used Google search data to study obesity patterns in the US, finding significant correlation between search keywords and body mass index levels. “Results suggest great promise of the idea of obesity monitoring through real-time Google Trends data”, note the authors, which would be “particularly attractive for government health institutions and private businesses such as insurance companies.”

If Google senses a flu epidemic somewhere, it’s hard to challenge its hunch – we simply lack the infrastructure to process data at that scale. Google can be proven wrong after the fact – as has recently been the case with its flu trends data, which was shown to overestimate the number of infections, possibly because of its failure to account for the intense media coverage of flu – but the same is true of most terrorist alerts. It’s the immediate, real-time nature of computer systems that makes them perfect allies of an infinitely expanding and pre-emption-obsessed state.

Perhaps, the case of Gloria Placente and her failed trip to the beach was not just a historical oddity but an early omen of how real-time computing, combined with ubiquitous communication technologies, would transform the state. One of the few people to have heeded that omen was a little-known American advertising executive called Robert MacBride, who pushed the logic behind Operation Corral to its ultimate conclusions in his unjustly neglected 1967 book, The Automated State.

At the time, America was debating the merits of establishing a national data centre to aggregate various national statistics and make them available to government agencies. MacBride attacked his contemporaries’ inability to see how the state would exploit the metadata accrued as everything was being computerised. Instead of “a large scale, up-to-date Austro-Hungarian empire”, modern computer systems would produce “a bureaucracy of almost celestial capacity” that can “discern and define relationships in a manner which no human bureaucracy could ever hope to do”.

“Whether one bowls on a Sunday or visits a library instead is [of] no consequence since no one checks those things,” he wrote. Not so when computer systems can aggregate data from different domains and spot correlations. “Our individual behaviour in buying and selling an automobile, a house, or a security, in paying our debts and acquiring new ones, and in earning money and being paid, will be noted meticulously and studied exhaustively,” warned MacBride. Thus, a citizen will soon discover that “his choice of magazine subscriptions… can be found to indicate accurately the probability of his maintaining his property or his interest in the education of his children.” This sounds eerily similar to the recent case of a hapless father who found that his daughter was pregnant from a coupon that Target, a retailer, sent to their house. Target’s hunch was based on its analysis of products – for example, unscented lotion – usually bought by other pregnant women.

For MacBride the conclusion was obvious. “Political rights won’t be violated but will resemble those of a small stockholder in a giant enterprise,” he wrote. “The mark of sophistication and savoir-faire in this future will be the grace and flexibility with which one accepts one’s role and makes the most of what it offers.” In other words, since we are all entrepreneurs first – and citizens second – we might as well make the most of it.

What, then, is to be done? Technophobia is no solution. Progressives need technologies that would stick with the spirit, if not the institutional form, of the welfare state, preserving its commitment to creating ideal conditions for human flourishing. Even some ultrastability is welcome. Stability was a laudable goal of the welfare state, but it came with a trap: in specifying the exact protections that the state was to offer against the excesses of capitalism, it could not easily deflect new, previously unspecified forms of exploitation.

How do we build welfarism that is both decentralised and ultrastable? A form of guaranteed basic income – whereby some welfare services are replaced by direct cash transfers to citizens – fits the two criteria.

Creating the right conditions for the emergence of political communities around causes and issues they deem relevant would be another good step. Full compliance with the principle of ultrastability dictates that such issues cannot be anticipated or dictated from above – by political parties or trade unions – and must be left unspecified.

What can be specified is the kind of communications infrastructure needed to abet this cause: it should be free to use, hard to track, and open to new, subversive uses. Silicon Valley’s existing infrastructure is great for fulfilling the needs of the state, not of self-organising citizens. It can, of course, be redeployed for activist causes – and it often is – but there’s no reason to accept the status quo as either ideal or inevitable.

Why, after all, appropriate what should belong to the people in the first place? While many of the creators of the internet bemoan how low their creature has fallen, their anger is misdirected. The fault is not with that amorphous entity but, first of all, with the absence of robust technology policy on the left – a policy that can counter the pro-innovation, pro-disruption, pro-privatisation agenda of Silicon Valley. In its absence, all these emerging political communities will operate with their wings clipped. Whether the next Occupy Wall Street would be able to occupy anything in a truly smart city remains to be seen: most likely, they would be out-censored and out-droned.

To his credit, MacBride understood all of this in 1967. “Given the resources of modern technology and planning techniques,” he warned, “it is really no great trick to transform even a country like ours into a smoothly running corporation where every detail of life is a mechanical function to be taken care of.” MacBride’s fear is O’Reilly’s master plan: the government, he writes, ought to be modelled on the “lean startup” approach of Silicon Valley, which is “using data to constantly revise and tune its approach to the market”. It’s this very approach that Facebook has recently deployed to maximise user engagement on the site: if showing users more happy stories does the trick, so be it.

Algorithmic regulation, whatever its immediate benefits, will give us a political regime where technology corporations and government bureaucrats call all the shots. The Polish science fiction writer Stanislaw Lem, in a pointed critique of cybernetics published, as it happens, roughly at the same time as The Automated State, put it best: “Society cannot give up the burden of having to decide about its own fate by sacrificing this freedom for the sake of the cybernetic regulator.”