Capitalism Is Doomed — Without Alternatives, So Are We


‘Though it appears as if rumors of capitalism’s imminent demise have been greatly exaggerated,’ writes Johnson, ‘there is good reason to believe that its remarkable ability to adapt and evolve in the face of frequent (self-induced) shocks has reached a breaking point.’ (Image: OpenClipArt)

In 1946, George Orwell pondered the fragility of the capitalist order.

Reviewing the work of the influential theorist James Burnham, Orwell presaged several concepts that would later form the groundwork for his best-known novel, 1984.


In his book The Managerial Revolution, Burnham envisioned, as Orwell put it, “a new kind of planned, centralised society which will be neither capitalist nor, in any accepted sense of the word, democratic. The rulers of this new society will be the people who effectively control the means of production.”

“The real question,” Orwell adds, “is not whether the people who wipe their boots on us during the next fifty years are to be called managers, bureaucrats, or politicians: the question is whether capitalism, now obviously doomed, is to give way to oligarchy or to true democracy.”

While Orwell was wary of Burnham’s worldview and of his more specific predictions, he agreed that the relationship between capitalism and democracy has always been, and always will be, a precarious one.

“For quite fifty years past,” Orwell noted, “the general drift has almost certainly been towards oligarchy.”


Pointing to the concentration of political and economic power in the hands of the few and acknowledging “the weakness of the proletariat against the centralised state,” Orwell was far from optimistic about the future — but he was quite certain that the economic status quo would eventually give way.

Recent events, and the material circumstances of much of the world’s population, have prompted serious examinations of the same questions Orwell was considering seven decades ago. And though it appears as if rumors of capitalism’s imminent demise have been greatly exaggerated, there is good reason to believe that its remarkable ability to adapt and evolve in the face of frequent (self-induced) shocks has reached a breaking point.

Widespread discontent over stagnant incomes and the uneven prosperity brought about by neoliberal globalization has, in 2016, come to a head in striking fashion; Donald Trump, Brexit, and the rise of far-right parties in Europe have many questioning previously sacred assumptions.

“Is the marriage between liberal democracy and global capitalism an enduring one?” asked Martin Wolf, a formidable commentator in one of the world’s leading business papers, the Financial Times.

This was no rhetorical softball; Wolf is genuinely concerned that the winners of globalization have grown complacent, that they have “taken for granted” a couple that was only tenuously compatible to begin with. He also worries, rightly, that they have downplayed the concerns of the “losers.”

Wolf concludes that “if the legitimacy of our democratic political systems is to be maintained, economic policy must be orientated towards promoting the interests of the many not the few; in the first place would be the citizenry, to whom the politicians are accountable.”

Not all members of the commentariat share Wolf’s willingness to engage with these cherished assumptions, however. Indeed, many analysts have reserved their ire not for failing institutions or policies but for the public, reviving Walter Lippmann’s characterization of the masses as a “bewildered herd” that, if left to its own devices, is sure to usher in a regime of chaos.

“It’s time,” declared Foreign Policy’s James Traub, channeling the sentiments of Josh Barro, “for the elites to rise up against the ignorant masses.”

Apologists like Traub and Barro — just two among many — speak and write as if the leash previously restraining the “herd” has been loosened, and that the resulting freedom has laid bare what elitists have long believed to be the case: To use Barro’s infamous words, “Elites are usually elite for good reason, and tend to have better judgment than the average person.” They point to the rise of Donald Trump as evidence of an intolerable democratic surplus — evidence, in short, of what the masses will do if granted a loud enough voice.

Aside from being conveniently self-serving, this narrative is also false.

Far from loosening the leash, elites have consolidated power to an unprecedented extent, and they have used their influence to undercut democratic movements and hijack public institutions. The resulting concentration of wealth and political power is jarring, and it puts the lie to the farcical notion that elites are a persecuted minority.

But, in the midst of these anti-democratic diatribes, fascinating and important critiques of a rather different nature have emerged.


Instead of urging us to align Against Democracy, to use the name of a recent book by the libertarian political philosopher Jason Brennan, many are arguing that it is capitalism, and not the excesses of the democratic process, that has provided figures like Trump a launching pad.

In his book Postcapitalism, Paul Mason argues that the rapid emergence of information technology has corroded the boundaries of the market; “capitalism,” he insists, “has reached the limits of its capacity to adapt.” And its attempts to reach beyond these limits have fostered an economic environment defined by instability, crippling austerity for the many, and rapid accumulation of wealth for the few.

According to Oxfam, the global 1 percent now owns as much wealth as the bottom 99 percent. CEO pay has continued to soar. And though post-crisis reforms have carried soaring promises of stability, the financial sector is still far too large, and many of the banks harmed by the crash they created are back and nearly as powerful as ever.

Mason summarizes: “According to the OECD, growth in the developed world will be ‘weak’ for the next fifty years. Inequality will rise by 40 per cent. Even in the developing countries, the current dynamism will be exhausted by 2060.”

“The OECD’s economists were too polite to say it,” he adds, “so let’s spell it out: for the developed world the best of capitalism is behind us, and for the rest it will be over in our lifetime.”

Sociologist Peter Frase, in his new book Four Futures, implicitly agrees with many of Mason’s key points, but he then takes up the task of looking further ahead, of contemplating possible futures that hinge largely upon how we respond to the crises we are likely to face in the coming years.

For Frase, not only is the best of capitalism behind us, but the worst of it may lie just ahead.

Central to Four Futures are what Frase calls the “[t]wo specters…haunting Earth in the twenty-first century” — “the specters of environmental catastrophe and automation.”

Rather than attempting to predict the future, Frase — guided by Rosa Luxemburg’s famous words, “Bourgeois society stands at the crossroads, either transition to socialism or regression into barbarism” — lays out potential, contingent scenarios. And while Mason’s book exudes optimism about the advancement of information technology and automation, Frase is more cautious.

“To the extent that the rich are able to maintain their power,” Frase writes, “we will live in a world where they enjoy the benefits of automated production, while the rest of us pay the costs of ecological destruction—if we can survive at all.” And, “To the extent that we can move toward a world of greater equality, then the future will be characterized by some combination of shared sacrifice and shared prosperity, depending on where we are on the other, ecological dimension.”

It comes down, in short, to who wins the class struggle. “I am a very old-fashioned Marxist in that way,” Frase remarked in a recent interview.

None of the futures Frase maps out is inevitable, the preordained result of historical forces beyond our control. He is contemptuous of those who cling to “secular eschatology”; capitalism’s collapse, he notes, will not likely be the result of a single, revolutionary moment.

In expressing this view he aligns with Wolfgang Streeck, who has argued that capitalism is “a social system in chronic disrepair,” and that while “we cannot know when and how exactly capitalism will disappear and what will succeed it,” we can know that a system that depends on endless growth and the elimination of all restraints will eventually self-destruct.

The disappearance of capitalism, though, as Orwell understood, does not necessarily imply the emergence of an egalitarian society, one in which resources are shared for the benefit of the many. But while few agree on precisely how to establish the framework for such a society, there are, Mason and Frase argue, policies that can move us in the right direction.

Both, for instance, support the idea of a universal basic income, which, in Frase’s words, would “create a situation in which it [is] possible to survive without depending on selling your labor to anyone who will pay for it,” making automation a path to liberation, not destitution. And Mason rightly argues that, in order to avert catastrophic warming, we must radically reduce carbon emissions.

But the usual political obstacles remain, as does the fact that the “winners” are not likely to hand over their gains, or their positions of power and influence, without a fight. We cannot, then, passively rely on amoral forces like technology to bring about the necessary change.

“Technological developments give a context for social transformations,” Frase writes, “but they never determine them directly; change is always mediated by the power struggles between organized masses of people.”


The future is necessarily disobedient; it rarely conforms to even the most meticulous theoretical anticipations, to say nothing of our deepest desires or fears.

But one thing is clear: The future of capitalism and the future of the planet are intertwined. The health of the latter depends on our ability to dismantle the former, and on our ability to construct an alternative that radically alters our course, which is at present leading us toward catastrophe.


Whether the path to which we are ultimately confined leads to a utopian dream or a dystopian nightmare is contingent upon our ability to connect the struggles that currently occupy the left: those fighting for the right to organize are confronting, at bottom, the same forces as those working to prevent the plunder of sacred land.

There are reasons to be both hopeful and pessimistic about the prospects of these struggles.

The campaign of Bernie Sanders, and the movements that emerged before it and alongside it, revealed that there is a large base of support for social democratic changes that, if enacted, would move us in the right direction.

The obstacles, however, are immense, as is the arithmetic: As Bill McKibben has noted, “The future of humanity depends on math,” and the climate math we face is “ominous.”

But, as Noam Chomsky has argued, the debate over the choice between pessimism and optimism is really no debate at all.

“We have two choices,” he concludes. “We can be pessimistic, give up and help ensure that the worst will happen. Or we can be optimistic, grasp the opportunities that surely exist and maybe help make the world a better place. Not much of a choice.”

Jake Johnson is an independent writer. Follow him on Twitter: @wordsofdissent

Clinton: The Silicon Valley Candidate

By refusing to release the transcripts of her paid speeches to Wall Street bankers, Democratic presidential candidate Hillary Clinton cast doubt on her independence from the crooks who run the financial system.  By contrast, Clinton’s program for “technology and innovation policy” has been an open book since June 2016.  What she publicized is as revealing – and as disturbing – as what she tried to keep secret.

Clinton paints her tech agenda in appealing terms.  She says it’s about reducing social and economic inequality, creating good jobs, and bridging the digital divide. The real goals – and beneficiaries – are different.  The document is described as “a love letter to Silicon Valley” by a journalist,[1] and as a “Silicon Valley wish list” by the Washington Post.[2]

On the domestic side, Clinton promises to invest in STEM education and immigration reform to expand the STEM workforce by allowing green cards for foreign workers who’ve earned STEM degrees in the US. The internet industry has been lobbying Congress for years to reform US immigration policy to gain flexibility in hiring, to ease access to a global pool of skilled labor, and to weaken employees’ bargaining power.[3]

Clinton’s blanket endorsement of online education opens new room for an odious private industry.  With buzzwords like “entrepreneurship,” “competitive,” and “bootstrap,” Clinton wants to “leverage technology”: by “delivering high-speed broadband to all Americans” she declares it will be feasible to provide “wrap-around learning for our students in the home and in our schools.”[4] Absent an overt commitment to public education, this is an encouragement to online vendors to renew their attack on the U.S. education system – despite a track record of failure and flagrant corruption. Still more deceitful is Hillary’s lack of acknowledgment of a personal conflict of interest.  According to a Financial Times analysis, after stepping down as Secretary of State in 2013, Hillary accepted hundreds of thousands of dollars for speeches to private education providers; her husband Bill has “earned” something like $21 million from for-profit education companies since 2010.[5]

Clinton’s proposal for access to high-speed Internet for all by 2020 would further relax regulation to help the Internet industry build new networks, tap into existing public infrastructure, and encourage “public and private” partnerships. These are euphemisms for corporate welfare, after the fashion of the Google fiber project – which is substantially subsidized by taxpayers, as cities lease land to the giant company for its broadband project at far below market value and offer city services for free or below cost.[6] Clinton’s policy program also backs the 5G wireless network initiative and the release of unlicensed spectrum to fuel the “Internet of Things” (IoT). 5G wireless and IoT are a solution in search of a problem – unless you are a corporate supplier or a business user of networks.  This is an unacknowledged program to accelerate and expand digital commodification.

Clinton’s international plans are equally manipulative. She will press for “an open Internet abroad,” that is, for “internet freedom” and “free flow of information across borders.” Despite the powerful appeal of this rhetoric, which she exploited systematically when she was Secretary of State, Clinton actually is pushing to bulwark U.S. big capital in general, and U.S. internet and media industries in particular.  Secretary Clinton’s major speech on Internet freedom[7] in 2010 came mere days after Google’s exit from China, supposedly on grounds of principle, making it plain that the two interventions – one private, one public – were coordinated elements of a single campaign.  Outside the United States, especially since the disclosures by Edward Snowden in 2013, it is increasingly well-understood that the rhetoric of human rights is a smokescreen for furthering U.S. business interests.[8] Reviving this approach is cynical electioneering rather than an endeavor to advance human rights or, indeed, more just international relations.

This in turn provides the context in which to understand Clinton’s vow to support the “multi-stakeholder” approach to Internet governance.  “Multi-stakeholderism” endows private corporations with public responsibilities, while it downgrades the ability of governments to influence Internet policy – as they have tried to do, notably, in the United Nations.  By shifting the domain in this way, the multi-stakeholder model actually reduces the institutional room available to challenge U.S. power over the global Internet.  It was for this very reason that the Obama Administration recently elevated multi-stakeholderism into the reigning principle for global Internet governance:  On 1 October, the U.S. Commerce Department preempted (other) governments from exercising a formal role.

This is, once again, the preferred agenda of Silicon Valley.[9] Aaron Cooper, vice president of strategic initiatives for the Software Alliance, a Washington trade group representing software developers, crowed in a Washington Post interview, “A lot of the proposals that are in the Clinton initiative are consistent with the broad themes that [we] and other tech associations have been talking about, so we’re very pleased.”[10]

To build up her policy platform in this vital field, Clinton has assembled a network of more than 100 tech and telecom advisors.[11] The members of this shadowy group have not been named, but they are said to include former advisors and officials, affiliates of think-tanks and trade groups, and executives at media corporations.  Apparently, just as with respect to Wall Street, the public has no right to know who is shaping Clinton’s program for technology.  Equally clearly, however, it is meant to resonate with Apple’s Tim Cook, Tesla CEO Elon Musk, and Facebook co-founder Dustin Moskovitz – all of whom have publicly rallied to her campaign.[12]

Some might choose to emphasize that the Republican candidate, Donald Trump, has not even bothered to hint to voters about his tech and information policy. Fair enough. Clinton’s program, though, is both surreptitious and plutocratic. It’s not that she’s not good enough – it’s that she’s in the wrong camp.  Britain’s Labour Party leader Jeremy Corbyn’s “Digital Democracy” program offers a better entry point for thinking about democratic information policy, as it includes publicly financed universal internet access, fair wages for cultural workers, release to open source of publicly funded software and hardware, cooperative ownership of digital platforms, and more.  That would be a start.


[1] Noah Kulwin, “Hillary Clinton’s tech policy proposal sounds like a love letter to Silicon Valley,” recode, June 28, 2016.

[2] Brian Fung, “Hillary Clinton’s tech agenda is really a huge economic plan in disguise,” Washington Post, June 28, 2016.

[3] Schiller, D. & Yeo. S. (Forthcoming, Fall 2016) Science and Engineering Into Digital Capitalism, in Tyfield, D., Lave, R., Randalls, S., and Thorpe, C. (Eds.), Routledge Handbook of the Political Economy of Science.

[4] “Hillary Clinton’s Initiative on Technology and Innovation,” The Briefing, June 27, 2016.

[5] Gary Silverman, “Hillary and Bill Clinton: The For-Profit Partnership,” Financial Times, July 21, 2016.

[6] Kenric Ward, “Taxpayers subsidize Google Fiber in this city with bargain land leases,” August 16, 2016; Timothy B. Lee, “How Kansas City taxpayers support Google Fiber,” Ars Technica, September 7, 2012.

[7] Hillary Rodham Clinton, Secretary of State, “Remarks on Internet Freedom,” January 21, 2010, The Newseum, Washington, DC.

[8] Dan Schiller, Digital Depression.  Urbana: University of Illinois Press, 2014: 161-69.

[9] Heather Greenfield, “CCIA Applauds Hillary Clinton’s Tech Agenda,” Computer & Communications Industry Association, June 28, 2016.

[10] Brian Fung, “Hillary Clinton’s tech agenda is really a huge economic plan in disguise,” Washington Post, June 28, 2016.

[11] Margaret Harding McGill & Nancy Scola, “Clinton quietly amasses tech policy corps,” Politico, August 24, 2016; Steven Levy, “How Hillary Clinton Adopted the Wonkiest Tech Policy Ever,” Backchannel, August 29, 2016; Tony Romm, “Inside Clinton’s tech policy circle,” Politico, June 7, 2016.

[12] Levy Sumagaysay, “Facebook co-founder pledges $20 million to help Hillary Clinton defeat Donald Trump,” The Mercury News, September 9, 2016; Russell Brandom, “Tim Cook is hosting a fundraiser for Hillary Clinton,” The Verge, July 29, 2016.

This article originally appeared on Information Observatory.

Dan Schiller is a historian of information and communications at the University of Illinois. His most recent book is Digital Depression: Information Technology and Economic Crisis. Shinjoung Yeo is an assistant professor at Loughborough University in London.


Werner Herzog’s Lo and Behold: Reveries of The Connected World

Exploring the origins and impact of the Internet

By Kevin Reed
8 October 2016

German filmmaker Werner Herzog’s new documentary Lo and Behold: Reveries of The Connected World was released in August at select theatres across the US and for home viewing from various on-demand services. The movie—which examines the origins and implications of the Internet and related technologies such as artificial intelligence, robotics, the Internet of Things and space travel—has received generally favorable reviews following its premiere at the Sundance Film Festival in late January.


The work is divided into ten segments with titles like “The Early Days,” “The Glory of the Net” and “The Future,” with Herzog serving as narrator. Through a series of interviews, the director stitches his disparate topics together to explain something about how the Internet and World Wide Web were created and then to paint a troubling picture of the globally interconnected landscape.

The movie begins with a visit to the campus of the University of California, Los Angeles (UCLA), the birthplace—along with the Stanford Research Institute—of the Internet. The first interviewee is Leonard Kleinrock, one of the research scientists responsible for the development of the precursor of the Internet called ARPANET (Advanced Research Projects Agency Network of the US Defense Department). At age 82, Kleinrock is obviously thrilled at the opportunity to describe how the first-ever electronic message was transmitted between two points on the network.

As he opens a cabinet of early Internet hardware called a “packet switch,” Kleinrock describes in detail the events of October 29, 1969 at 10:30 pm. As the UCLA sender began typing the word “login”—and checking by telephone with his counterpart at Stanford University—only the first two characters of the message were successfully transmitted before his computer crashed. Despite this seemingly failed communication attempt, Kleinrock explains that “Lo” was an entirely appropriate word for the accomplishment. “It was from here,” he says, “that a revolution began.”

With Herzog occasionally interjecting off-camera during the interviews, the director’s goal seems clear enough. He wants the audience to share his sense of wonder and amazement at the transformative impact of the Internet. This is reinforced by equally intriguing interviews with several others who participated in the birth of the Net. The enthusiasm—and clarity on complex topics—expressed by these pioneers leaves one with a desire to hear more of their stories of discovery and progress.

As the film goes on, however, it emerges that Herzog has another plan; he abandons any historically logical accounting of the Internet and begins eclectically focusing on its various byproducts and offshoots, limitations and negative consequences. Herzog’s interview with Ted Nelson—a philosopher and sociologist credited with theoretically anticipating the World Wide Web and coining the terms “hypertext” and “hypermedia”—becomes the starting point for these wanderings.

Werner Herzog in 2007 (Photo: Erinc Salor)

As a student at Harvard University, Ted Nelson began working in 1960 on a computer system called Project Xanadu that he conceived of as “a digital repository scheme for world-wide electronic publishing.” Nelson also wrote an important book in 1974 entitled Computer Lib/Dream Machines, a kind of manifesto for hobbyists on the social and revolutionary implications of the personal computer.

Although it is left unexplained in the film, the Internet is the technical infrastructure upon which the World Wide Web was developed beginning in 1989. Ever since the widespread adoption of the World Wide Web, Nelson has been a public critic of its structure and implementation, especially HTML (Hypertext Markup Language). He has called HTML a gross oversimplification of his pioneering ideas and said that it “trivializes our original hypertext model with one-way, ever-breaking links and no management of version or contents.”
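Nelson’s complaint is easier to see with a toy model. The sketch below (a hypothetical illustration in Python, with invented class names, not anything from the film) contrasts a Web-style one-way link, which lives only inside the source page, with a Xanadu-style link record that remains visible from both ends:

```python
class WebPage:
    """A Web-style page: it records only its outgoing links, like <a href>."""

    def __init__(self, name):
        self.name = name
        self.links_to = []

    def add_link(self, target_name):
        # One-way: the target is never told it has been linked to, and if it
        # later disappears the reference simply breaks ("ever-breaking links").
        self.links_to.append(target_name)


class LinkRegistry:
    """A Xanadu-style registry: every link is stored in both directions."""

    def __init__(self):
        self.forward = {}   # document -> documents it links to
        self.backward = {}  # document -> documents that link to it

    def link(self, src, dst):
        self.forward.setdefault(src, []).append(dst)
        self.backward.setdefault(dst, []).append(src)

    def outgoing(self, doc):
        return self.forward.get(doc, [])

    def incoming(self, doc):
        # The crucial difference: from any document we can discover who
        # cites it, so a link need never silently break.
        return self.backward.get(doc, [])


if __name__ == "__main__":
    page = WebPage("essay")
    page.add_link("source")              # "source" has no record of this

    registry = LinkRegistry()
    registry.link("essay", "source")
    print(registry.incoming("source"))   # the link is visible from both ends
```

In the first model, knowledge of the link exists only in the source page; in the second, the link is a first-class record that either endpoint can query, which is roughly the property Nelson argues HTML discarded.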

Why did HTML and the World Wide Web emerge as the dominant graphical layer of the Internet rather than a competing set of ideas? Is it possible that a more comprehensive solution, one expressing the technology’s potential more completely and proving more effective and useful, could have been adopted instead?

One aspect of the rapid global adoption of the World Wide Web—originally created by Tim Berners-Lee in 1989 at CERN in Switzerland—was the open access policy of its inventor. As Berners-Lee, who is also interviewed in the film, has explained, “Had the technology been proprietary, and in my total control, it would probably not have taken off. You can’t propose that something be a universal space and at the same time keep control of it.” However, while the non-proprietary nature of Berners-Lee’s creation was a significant factor in its success, it does not automatically follow that the core technology of the World Wide Web represented an advance over the ideas represented by others such as Ted Nelson.

These important and complex questions have recurred again and again over the past half-century of the information revolution. Exploring them further would point to a fundamental problem of modern technology: the contradiction between “what is possible” and “what is required” within the economic and political framework of global capitalist society.

Showing little interest in exploring these matters more deeply, Lo and Behold goes on to present Nelson—a gifted but socially awkward man—as something of a high-tech Don Quixote. Herzog concludes the interview with the quip, “To us you appear to be the only one around who is clinically sane.”


Having made nearly forty documentaries in his five-decade career, Herzog is accomplished at gaining access to people with compelling stories to tell. The interview with Elon Musk, founder of Tesla Motors and SpaceX, raises important points. A consistently outspoken opponent of artificial intelligence, Musk makes the following warning: “[I]f you were running a hedge fund or private equity fund and all I want my AI to do is maximize the value of my portfolio, then AI could decide to short consumer stocks, go long on defense stocks, and start a war. Ah, and that obviously would be quite bad.”

This possible scenario under capitalism is not explored any further. While the US military is never specifically mentioned, it is remarkable that the only reference to war in the course of a 98-minute critical look at modern technology comes from a billionaire entrepreneur. Above all, Musk’s comments show that the new technologies by themselves bring no fundamental change to the class relations within capitalist society; indeed the Internet and artificial intelligence in the hands of the ruling elite enable a further and accelerated integration of financial parasitism and imperialist war.

Given that Lo and Behold is sponsored by Netscout Systems, a major corporate supplier of networking hardware and software, it is possible that such topics were off limits. However, the lack of a broader or coherent critical perspective is not something new for Werner Herzog.

While he made some interesting and disturbing fiction films in the 1970s (The Enigma of Kaspar Hauser, Aguirre: The Wrath of God and Stroszek in particular), the end of the period of radicalization had an impact on Herzog, as it did on other New German Cinema directors like R. W. Fassbinder, Wim Wenders and Volker Schlöndorff. There was always an overwrought element in Herzog’s work and an emphasis on physical or spiritual excess, without much reference to the content of the action.

In media interviews about his latest film, Herzog has been careful to explain that he does not blame technology itself for the aberrations depicted. “The Internet is not good or evil, dark or light hearted,” he says, “it is human beings” that are the problem. Following the advice of experts, Herzog suggests that people need some kind of “filter” to help them use the technology appropriately.

Leaving things so very much at the level of the individual does not begin to get at the source of the contradiction between the positive and destructive potential of modern technology. This contradiction, so clearly demonstrated during World War II with nuclear technology, is itself an expression of the alternatives facing mankind of socialism versus barbarism.

Lack of an understanding about—or refusal to acknowledge—the deeper social and class interests embedded in the forms of human technology leads to only two possible conclusions: (1) the utopian idea that technology develops automatically, without wars and crises, toward the improvement of mankind, or (2) the dystopian belief that technological advancement inevitably develops toward an existential threat to humanity, without any hope of a revolutionary transformation of society. While Herzog and his producers believe they have provided a balanced perspective between these two, in the end, Lo and Behold comes down on the latter side.


Robert Reich: Why it’s time to start considering a universal basic income

As the job market contracts, there’s only one solution to mass unemployment



This originally appeared on Robert Reich’s blog.

Imagine a little gadget called an i-Everything. You can’t get it yet, but if technology keeps moving as fast as it is now, the i-Everything will be with us before you know it.

A combination of intelligent computing, 3-D manufacturing, big data crunching, and advanced bio-technology, this little machine will be able to do everything you want and give you everything you need.

There’s only one hitch. As the economy is now organized, no one will be able to buy it, because there won’t be any paying jobs left. You see, the i-Everything will do … everything.

We’re heading toward the i-Everything far quicker than most people realize. Even now, we’re producing more and more with fewer and fewer people.

Internet sales are on the way to replacing millions of retail workers. Diagnostic apps will be replacing hundreds of thousands of health-care workers. Self-driving cars and trucks will replace 5 million drivers.

Researchers estimate that almost half of all U.S. jobs are at risk of being automated in the next two decades.

This isn’t necessarily bad. The economy we’re heading toward could offer millions of people more free time to do what they want to do instead of what they have to do to earn a living.

But to make this work, we’ll have to figure out some way to recirculate the money from the handful of people who design and own i-Everythings, to the rest of us who will want to buy i-Everythings.

One answer: A universal basic income – possibly financed out of the profits going to such labor-replacing innovations, or perhaps even a revenue stream off of the underlying intellectual property.

The idea of a universal basic income isn't as radical as it may sound. Historically, it's had support from people on both the left and the right. In the 1970s, President Nixon proposed a similar concept for the United States, and it even passed the House of Representatives.

The idea is getting some traction again, partly because of the speed of technological change. I keep running into executives of high-tech companies who tell me a universal basic income is inevitable, eventually.

Some conservatives believe it's superior to other kinds of public assistance because a universal basic income doesn't tell people what to spend the assistance on, and doesn't stigmatize recipients because everyone qualifies.

In recent years, evidence has shown that giving people cash as a way to address poverty actually works. In study after study, people don’t stop working and they don’t drink it away.

Interest in a basic income is surging, with governments from Finland to Canada to Switzerland to Namibia debating it. The charity GiveDirectly is about to launch a basic income pilot in Kenya, providing an income for more than 10 years to some of the poorest and most vulnerable families on the planet, and then rigorously evaluating the results.

As new technologies replace work, the question for the future is how best to provide economic security for all.

A universal basic income will almost certainly be part of the answer.

Robert Reich, one of the nation’s leading experts on work and the economy, is Chancellor’s Professor of Public Policy at the Goldman School of Public Policy at the University of California at Berkeley. He has served in three national administrations, most recently as secretary of labor under President Bill Clinton. Time Magazine has named him one of the ten most effective cabinet secretaries of the last century. He has written 13 books, including his latest best-seller, “Aftershock: The Next Economy and America’s Future”; “The Work of Nations,” which has been translated into 22 languages; and his newest, an e-book, “Beyond Outrage.” His syndicated columns, television appearances, and public radio commentaries reach millions of people each week. He is also a founding editor of the American Prospect magazine, and chairman of the citizens’ group Common Cause. His new movie “Inequality for All” is in theaters. His widely read blog can be found

The Ugly Truth Behind Apple’s New iPhone 7

Posted on Sep 20, 2016

New product releases from Apple are often a time for analysis, comparison and celebration. But the arrival of the iPhone 7 has brought unwanted attention to the company’s darker side: globalization, oppression and greed.

In a report from The Guardian, Aditya Chakrabortty says that Apple oppresses Chinese workers, does not pay its fair share of taxes and deprives Americans of high-paying jobs while making enormous profits.

Apple’s iPhones are assembled at three firms in China: Foxconn, Wistron and Pegatron. While Apple CEO Tim Cook says the company cares about all its workers—calling any words to the contrary “patently false and offensive”—the facts on the ground show the opposite.

In 2010, Foxconn employees were killing themselves in high numbers—an estimated 18 attempted suicide, and 14 died. The company responded by putting up suicide-prevention netting to catch workers before they fell. Apple vowed to improve conditions at the plant, yet in August, after reports surfaced that changes in overtime policies had caused great stress among workers, two more employees killed themselves.

At the Wistron factory, a Danish human-rights organization found that thousands of students are forced to work the same hours as adults, for less pay. Students were told they were required to work if they wanted to receive their diplomas. The use of underage workers is not a new revelation for Apple: in 2010, the company admitted that 15-year-old children were working in factories supplying Apple products. At a plant run by Wintek in Suzhou, China, workers reportedly were being poisoned by n-hexane, a toxic chemical that causes muscular atrophy and blurred eyesight.

At Pegatron—the other iPhone assembler—U.S.-based China Labor Watch found that staff members work 12-hour days, six days a week. They are forced to work overtime, 1½ hours of which are unpaid. One researcher working there had to stand during his entire 10½-hour shift. When the local government raised the minimum wage, Pegatron cut subsidies for medical insurance.

The Guardian reports:

While iPhone workers for Pegatron saw their hourly pay drop to just $1.60 an hour, Apple remained the most profitable big company in America, pulling in over $47bn in profit in 2015 alone.

What does this add up to? At $231bn, Apple has a bigger cash pile than the US government, but apparently won’t spend even a sliver on improving conditions for those who actually make its money. Nor will it make those iPhones in America, which would create jobs and still leave it as the most profitable smartphone in the world.

It would rather accrue more profits, to go to those who hold Apple stock—such as company boss Tim Cook, whose hoard of company shares is worth $785m. Friends of Cook point to his philanthropy, but while he’s happy to spend on pet projects, he rejects a €13bn tax bill from the EU  as “political crap”—while boasting about how he won’t bring Apple’s billions back to the US “until there’s a fair rate … . It doesn’t go that the more you pay, the more patriotic you are.” The tech oligarch seems to think he knows better than 300 million Americans what tax rates their elected government should set.

When the historians of globalisation ask why it died, they will surely find that companies such as Apple form a large part of the answer. Faced with a binary choice between an economic model that lavishly rewarded a few and a populism that makes lavish promises to many, between Cook on the one hand and [Nigel] Farage on the other, the voters went for the one who at least didn’t bang on about “courage”.

According to a new report from Global Justice Now, a group based in the United Kingdom, 69 of the top 100 economies in the world are corporate entities (an increase from 63 a year ago). Apple is one of those corporate entities. With $234 billion in revenue in 2015, Apple is the ninth-largest company in the world and is wealthier than most countries.

I Came to San Francisco to Change My Life: I Found a Tribe of Depressed Workaholics Living on Top of One Another

Hacker House Blues: my life with 12 programmers, 2 rooms and one 21st-century dream.
By David Garczynski / Salon September 18, 2016

I might have been trespassing up there, but I would often go to the 19th-floor business lounge to work and study. Located on the top floor of a luxury high-rise in the SOMA district of San Francisco, the lounge was only accessible to residents of the building. Yet for a while I found myself there almost every day.

Seventeen floors below, I lived in an illegal Airbnb with 12 roommates split between two rooms. There were six people packed into my bedroom alone — seven, if you included the guy who lived in the closet. Three bunk beds adorned the walls, and I was fortunate enough to score a bottom bunk. Unfortunately, though, it was not the one by the window, which, with the exception of one dim lamp, was the only source of light in the room. Even at midday, the room never lit up much more than a shadowed cave. At most hours of the day, you could find someone sleeping in there. Getting in and out of bed was a precarious dance in the darkness to avoid stepping into the suitcases on the floor, out of which most of us lived.

In the shared kitchen, the sink more often than not held a giant pile of dishes, and the fridge, packed with everyone’s groceries and leftovers, emanated a slightly moldy aroma. Mixed in there were the half-eaten meals and unfinished condiment jars of tenants who had long since moved out — all left to rot, but often too far buried in the mass of food to be located.

Let’s just say the room was not as advertised.

The Airbnb posting did boast of access to a 24-hour gym, roof deck and bocce courts. The building has an indoor basketball court, an outdoor hot tub and even a rock climbing wall. The 19th-floor business lounge alone comes with a pool table, a porch, several flat-screen TVs and an enviable view of much of San Francisco. For $1,200 a month, it all seemed worth it. The post did say it was a four-person apartment, not 13, and included a picture of a sunny room with a pair of bunk beds, but I figured for a short sub-lease while I attended coding school, it wouldn’t be so bad. The reviews, after all, were pretty positive, too: mostly five stars. However, none of them mentioned the fact that I wouldn’t even be given a front door key.

I’d have to sneak into the building every night. I’d wait until someone exited or entered, then slip through the door before it closed. From there I’d walk straight past the front desk guard and head to the bank of elevators. Despite my nerves, that part was surprisingly easy. The building caters to the young tech elite, so a backwards hat and a collegiate T-shirt practically made me invisible. When I got to my floor, I’d make sure none of the neighbors were watching, and if no one was around, I’d stand on my tiptoes and grab the communal key hidden atop the exit sign. Once the door was unlocked, I’d return the key to its perch for the next tenant to use.

I had moved to San Francisco to break into the tech world after being accepted into one of those ubiquitous 12-week coding boot camps. I had dreams of becoming a programmer, hoping one day I could land a remote contracting gig — a job where I could work from wherever and make a good living. My life would be part ski bum and part professional.

In my mid-20s uncertainty, the coding route seemed to have the most promise — high paychecks in companies that prized work-life balance, or so it seemed from afar. I knew the road wouldn’t be easy, but any time I’d mention my ambitions to family and friends, they responded with resounding positivity, affirming my belief that it was a well-worn path to an obtainable goal.

All of the people in that Airbnb were programmers. Some were trying to break into the industry through boot camps, but most were already full-time professional coders. They headed out early in the morning to their jobs at start-ups in the neighborhood. A lot of them hailed from some of the top schools in the country: Stanford, MIT, Dartmouth. If I was going to get through my program, I needed to rely on them, academically and emotionally. Once the program started up, I would find myself coding 15 hours a day during the week, with that number mercifully dropping to 10-12 hours on the weekends. Late at night, when my stressed-out thoughts would form an ever-intensifying feedback loop of questioning despair — What am I doing? Is this really worth it? — I would need to be able to look to the people around me as living reminders of the possibility of my goals.

Every night, the people whose jobs I coveted would come home from 10- to 12-hour shifts in front of a computer and proceed to the couch, where they’d open up their laptops and spend the remaining hours of the night in silence, sifting through more and more lines of code. Beyond preternatural math abilities and a penchant for problem solving, it seemed most didn’t have much in the way of life skills. They weren’t who I thought they would be — a community of intelligent and inspiring men and women bouncing ideas back and forth. Rather they were boys and girls, coddled by day in the security of companies that fed them, entertained them and nursed them. At home, they could barely take care of themselves.

Take for example the programmer who lived in my closet: Every night he’d come home around 9 p.m. He’d sit on the couch, pour himself a bowl of cereal and eat in silence. Then he would grab his laptop and head directly into the closet — a so-called “private room” listed on Airbnb for $1,400 a month. It was the only time I’d ever see him. The only way I could tell he was home was by the glow of his laptop seeping out from under the closet door. Hours later, deep into the night, the light would go out, and I would know he had gone to sleep. By the time I arrived, he had been living there for 16 months, in a windowless closet with a thin mattress placed right on the floor. During the day he codes for Pinterest. Yeah . . . that Pinterest.

Maybe there were people working in this city who were living out the tech dreams of everyone else, but I’ve realized the number of people who dream about it far exceeds the number who attain it. Everyone I spoke to in this town seemed doe-eyed about the future, even while they were living in illegal Airbnbs and working at failing startups across the city.

The odds weren’t in my favor. Most likely I’d find myself in the 92 percent of start-ups that go under in three years, trapped like some of my friends — much smarter and better programmers than I’ll ever be — bouncing from failing company to failing company.

Or maybe not. Maybe I would make it, only to become like my friends who earn six-figure paychecks and still lament that they’ll never be able to buy a home here. What illusions could I continue to maintain then?

There was a good chance I’d find myself in a situation like another roommate’s. During salary negotiations for a job at a start-up, he was encouraged to accept the pay tier with a lower salary but higher equity stake. Now he works 12-hour days just to try to keep the company (and his potential payout) afloat on a paycheck not much higher than some entry-level, non-programming jobs.

The most likely scenario, however, was that I’d become like the mid-30s man who slept in the bunk above me. The reality of his situation slowly slipped him into a depressive state, until he was sleeping most hours of the day. The rest of his waking hours were spent walking around slumped and gloomy.

Programming for me was never supposed to be more than a means to an end, but that end started to feel farther and farther away. The longer I lived in that Airbnb, the more I realized my dreams would never be met. In all likelihood I would be swept up in an economy here that traded on the hopes and dreams of the people clamoring to break in. The illegal Airbnbs that dot the city can charge what they do because there is no shortage of people wanting in. There is always another smart kid around the corner who believes that, despite the working and living conditions, this is just the first step to striking it big. Never tell them the odds.

I had hinged my happiness on an illusion and naively fought to get into a community that wouldn’t help me advance in the direction of my dreams. Maybe in the end I would get everything I needed or at least a nice paycheck, but I’d lose all of myself in the process. I’d be churned and beaten by the underbelly of the tech world here long before I could ever make it out.

If you are interested, it’s not that hard to sneak up to the 19th-floor lounge. I still do sometimes, despite having long since moved out and given up programming. From up there the view of San Francisco takes on the artificial quality of a miniature model. To the north, you’ll see a sea of tech start-ups, their signs and symbols a wild mash of colors. From this distance, it can all look so peaceful. Just know that somewhere in that view is another “hacker house” with bright kids living in almost migrant-worker conditions. Somewhere out there is a coding boot camp with slightly inflated numbers, selling a dream. Their fluorescent halls and cramped bedrooms are filled with the perennially hopeful looking to take the place of those who have already realized this dream isn’t all it’s cracked up to be.

It is a beautiful view, though. Just one I no longer want for myself.

David Garczynski has lived in the Bay Area for one year now. In that time, he’s lived in an illegal Airbnb, on his cousin’s couch, in two short-term subleases, and has been evicted once. He just signed an official (and legal) lease last week.

This Election Is Hillary Clinton’s to Lose, and She’s Screwing It Up

 Hillary Clinton raised $143 million in August—and still finds herself in a tight race with Donald Trump. (Photo: Andrew Harnik / AP)

You could not pick a worse, more inept, inexperienced or offensive joke of a presidential candidate than Donald Trump. The United States has become the butt of international ridicule over our very own “Kim Jong-un.” Any opposing major-party candidate with a pulse ought to be beating him in the polls by double digits. But Hillary Clinton isn’t.

The Democratic nominee is barely ahead of “the most unpopular presidential candidate since the former head of the Ku Klux Klan,” and a recent CNN poll puts her 2 percentage points behind Trump. Granted, it is only one poll, and several other recent polls have found her a few percentage points ahead. Still, no Democrat could ask for an easier Republican candidate to beat. In the history of American presidential races, we have likely never had a more comically unsuitable major-party nominee than Trump. And yet Clinton is struggling to come out ahead.

The Democrat’s ardent supporters—those who have championed her from Day One—claim that we live in a sexist country and that her gender is what is standing in the way of most Americans embracing her. They assert that the media and her critics hold her to an unfairly high standard. But in a country where white women have benefited far more from affirmative action policies, how is it that we easily elected the nation’s first black president twice, only to stumble over a white female nominee?

The problem is not her gender. Of course, many might refuse to vote for a woman (as I’m sure racist Americans refused to vote for Obama simply because he is black), but many more might vote for her because she is a woman. While there is no way to be certain that the two forces cancel each other out, Clinton’s gender is not her biggest liability. Her refusal to even attempt to embrace bold progressive values and her inability to read the simmering nationwide anger over economic and racial injustice are the larger obstacles to her popularity.

In positioning herself first and foremost as what she is not—Trump—Clinton is picking only the low-hanging fruit. My 9-year-old son could make fun of Trump in clever ways, and does so routinely. For Clinton to fixate on Trump’s endless flaws suggests that her own platform has little substance. For example, in a recent speech she said of Trump, “He says he has a secret plan to defeat ISIS. The secret is, he has no plan.” While these kinds of statements might make for funny one-liners, Clinton’s main credential is that she once led the State Department, and she did so with such hawkishness that Americans who are weary of endless wars are not impressed by the experience. (Not to mention that she was caught telling lies about her private email server while secretary of state.) If she proposed diplomacy over war, a plan to exit Iraq/Afghanistan/Syria, a promise to withhold weapons from Saudi Arabia, a commitment to Palestinian human rights, and so on, voters might sit up and take note.

On domestic issues, Clinton is failing to articulate a progressive vision as well. A recently leaked memo revealed that the Democratic Party views Black Lives Matter as a “radical” movement and recommends that the party not “offer support for concrete policy positions.” Troy Perry, who wrote the memo, is now part of Clinton’s campaign team. Rather than quickly rebutting the memo and affirming her full support for the movement, Clinton has remained silent. Meanwhile, BLM issued a pointed response, saying, “We deserve to be heard, not handled.”

Black voters tend to vote Democratic—a fact the party has taken for granted for decades. But if Clinton wants to earn those votes, she could take a page out of Green Party presidential candidate Jill Stein’s book and visit (or send a representative to visit) the ongoing occupation of Los Angeles City Hall by Black Lives Matter activists. BLM is calling on Mayor Eric Garcetti to fire Los Angeles Police Chief Charlie Beck over a spate of killings by officers that has made his department the most violent in the nation. Instead, Clinton goes to Beverly Hills for a fundraiser to hobnob with wealthy donors and celebrities, including Garcetti.

Clinton has also failed to offer bold thinking on the hot-button issue of immigration. Trump, in a recent controversial visit to Mexico, reiterated his bizarre plan to build a border wall and have our southern neighbor pay for it. Though Clinton stands nowhere near such a plan—and does not embrace Trump’s proposed ban on Muslims—she does share with him some troubling aspects of an inhumane, enforcement-heavy approach to immigration, including the use of biometric data to track the undocumented. She has gone on record as saying, “I voted numerous times when I was a senator to spend money to build a barrier to try to prevent illegal immigrants from coming in. … And I do think you have to control your borders.” She also voted for a 2006 bill that called for a fence on the U.S.-Mexico border that’s only a few hundred miles shorter than what Trump is proposing. If Clinton wanted to excite her Latino base, she could take a far bolder stance, admitting that she was wrong in her earlier positions and offering a humane vision, more in line with the one Bernie Sanders articulated that won him broad support.

Rather than reaching out to American voters on such issues, Clinton has been busy pandering to one particular community: the uber-rich. According to a New York Times article, she has made multiple trips to wealthy enclaves over the past month alone. In addition to Beverly Hills, she has visited Martha’s Vineyard and the Hamptons, rubbing elbows with celebrities and other rich elites. Just in August she raised more than $140 million through such fundraisers—easy fodder for the GOP to criticize in a new set of ads.

While making herself accessible to America’s upper classes, she has made herself almost completely unavailable to the press. Until Thursday, Clinton had not held a single news conference in 2016, inviting the unflattering comparison to President George W. Bush, who came under fire for avoiding interactions with the media. Bush was skewered for acting like he was hiding something, afraid the press might ask hard questions that would invite a blundering response. Clinton, one could argue, does not need to win over the press—most mainstream outlets already embrace her nomination and are pushing hard for her election. A recent article by Paul Krugman in the Times is a prime example. Ordinary Americans, however, continue to be unimpressed.

Perhaps Clinton feels that she can win without trying. After all, she has said publicly to her supporters, “I stand between you and the apocalypse.” She is positioning herself as a better option for president than the apocalyptic one. But that’s not saying much. And perhaps that is the point.

Maybe Clinton thinks she does not need to win over ordinary Americans. She knows she has the support of the Wall Street elite, the Pentagon war hawks and even a growing number of Republicans, one of whom implored his fellow Republicans to save the party by voting for Clinton.

And yet all of that may not be enough, as the polls are showing. What Americans are looking for is bold, visionary thinking that acknowledges how broken Washington, D.C., is at our collective expense. The majority of Americans do not want measured, lukewarm progressive positions that keep systems intact. This is why Sanders, in calling for a “political revolution,” attracted so many new and independent voters, especially young millennials. This is why Trump is gaining traction, because between the two major-party candidates, his pathetic piñata-inspiring figure is offering the bolder rhetoric.

If Clinton loses this election, it will not be because Americans are dumb, racist misogynists who would cut off their noses to spite their faces in refusing to elect a sane woman over an insane man. It will not be because too many Americans “selfishly” voted for a third party or didn’t vote at all. It will be because Clinton refused to compromise her allegiance to Wall Street and the morally bankrupt center-right establishment positions of her party and chose not to win over voters. This election is hers to lose, and if this nation ends up with President Trump, it will be most of all the fault of Clinton and the Democratic Party that backs her.

Sonali Kolhatkar is the host and executive producer of Uprising, a daily radio program at KPFK Pacifica Radio, soon to be on Free Speech TV (click here for the campaign to televise Uprising). She is also the Director of the Afghan Women’s Mission, a US-based non-profit that supports women’s rights activists in Afghanistan and co-author of “Bleeding Afghanistan: Washington, Warlords, and the Propaganda of Silence.”