Graffiti Streetart in Berlin
by Mr. Dheo
Google wants your money. Or, more precisely, Google wants your bank account and credit card info.
At Quartz, Chris Mims reports that Google appears to be accelerating the roll-out of a service that will allow Gmail users to send money via email to whomever they want, as easily as sending an attachment. Sounds great — but wait, there’s more!
Here’s what’s brilliant about offering the “send money” feature: Google almost certainly doesn’t care whether you use it to send money. What it cares about is getting you to sign up to Google Wallet and capture your bank account and credit-card information. And it’s using Gmail, which has a reach comparable to that of Facebook—425 million as of June 2012, the last time Google released numbers—to do it.
Once Google has your payment info, it can then implement PayPal-like functionality throughout the Google universe — YouTube, search, Maps, you name it. Anywhere you travel online while logged into your Google Account, you will have the ability to click-and-pay.
I can easily see this becoming popular. But here are three reasons to be wary.
1) Your Gmail account is already a hugely tempting target for hackers. Adding your financial info to that account will make it irresistible.
2) Google’s ability to effectively target ads already gives it tremendous power to manipulate consumer behavior. Adding the instant gratification of easy-checkout to those ads will make the company even more powerful.
3) Google already knows far too much about what we want, what we do, where we go, and who we communicate with. Do we really want to complete the chain and give the company our most intimate financial information?
The question posed by Google — and, really, by all online Web services — is this: At what point does convenience become vulnerability?
Those who own the country ought to govern it -- Founder John Jay
That quote may be apocryphal, but the sentiment has been with us since the beginning of the Republic. I think we all know he wasn’t talking about that amorphous mass known as “the people,” don’t you? Much better for the real stakeholders of democracy to be in charge. You know, the people with money and property.
And it appears as though they got what they wanted. According to this blockbuster study from Martin Gilens and Benjamin Page, the fact that the people sometimes have their policy preferences enacted is purely a matter of coincidence: It only happens if it happens to coincide with the preferences of the wealthy. If not, we’re just out of luck. The practical result is that while the wealthy might view certain issues along progressive lines, such as gay rights or maybe even gun control, it’s highly unlikely they will ever allow us to get too far out on the egalitarian limb. They tend to draw the line at anything that affects their own interests. And those interests lie in keeping most of the money and all of the power within their own control, regardless of whether they see themselves as liberals or conservatives.
Take, for instance, what Thomas Frank in this piece astutely described as liberal Mugwumpery, the “reformist” strain among aristocrats in which they take it upon themselves to reform the state and better the characters of the lower orders. Yes, they may want to clean up government corruption and coerce the “unhealthy” to change their ways, whether through temperance or prohibition, but they cannot be counted upon to fully engage on issues that infringe on their own interests. Frank quotes historian Richard Hofstadter describing the “Mugwump types” of the 19th century:
[T]he most serious abuses of the unfolding economic order of the Gilded Age he either resolutely ignored or accepted complacently as an inevitable result of the struggle for existence or the improvidence and laziness of the masses. As a rule, he was dogmatically committed to the prevailing theoretical economics of laissez faire. . . . He imagined that most of the economic ills that were remediable at all could be remedied by free trade, just as he believed that the essence of government lay in honest dealing by honest and competent men.
Frank applied that term specifically to Michael Bloomberg, who has pledged to spend $50 million to defeat the NRA, a worthy cause if there ever was one. As Frank points out, however, as much as this particular pledge might benefit the population at large, Bloomberg will also use his power to defeat anything that looks to directly benefit working people economically. Just as the Gilens-Page paper suggests, as long as the billionaires’ interests align with the people’s, there is a chance something might get done. Where they diverge, it might as well be impossible. That is a very odd definition of democracy.
Not all of our wealthy warriors for liberal causes are as openly hostile to the economic reforms at the center of the progressive agenda as a Wall Street billionaire like Bloomberg. In fact, many of them are probably just unaware of it, as they are the scions of great wealth who are flush with the idealism of youth and simply wish to make a difference. These Baby Mugwumps have good intentions, but somehow their focus also tends to be directed at worthy causes that don’t upset the economic apple cart.
For instance, these nice young would-be philanthropists who were invited to the White House last week to share their thoughts on how to fix the problems of the world:
“Moon shots!” one administration official said, kicking off the day on an inspirational note to embrace the White House as a partner and catalyst for putting their personal idealism into practice.
The well-heeled group seemed receptive. “I think it’s fantastic,” said Patrick Gage, a 19-year-old heir to the multibillion-dollar Carlson hotel and hospitality fortune. “I’ve never seen anything like this before.” Mr. Gage, physically boyish with naturally swooping Bieber bangs, wore a conservative pinstripe suit and a white oxford shirt. His family’s Carlson company, which owns Radisson hotels, Country Inns and Suites, T.G.I. Friday’s and other brands, is an industry leader in enforcing measures to combat trafficking and involuntary prostitution.
A freshman at Georgetown University, Mr. Gage was among the presenters at a breakout session, titled “Combating Human Trafficking,” that attracted a notable group of his peers. “The person two seats away from me was a Marriott,” he said. “And when I told her about trafficking, right away she was like, ‘Uh, yeah, I want to do that.’ ”
Of course. Who wouldn’t be against human trafficking? Or limiting the proliferation of guns, either? But I think one can see with those two examples just how limited the scope of our patrician betters’ interest in the public good really is. Whether undertaken through the prism of their own self-interest or a youthful idealism born of privilege, it represents causes, not any real challenge to the status quo.
But what should we make of the latest audacious entry into the political arena? Sean Parker, Napster co-founder and Facebook billionaire, announced that he’s jumping into politics with both feet. He’s not signing on to any specific cause or even a vague political philosophy. In fact, it’s almost impossible to figure out what it’s about.
One of the nice things about being a billionaire is that even if you have no idea what you believe, or any sense of how the political system works in theory or in practice, you can meet with the actual players and have them explain it to you. That’s what Parker has been doing, meeting with politicians of such disparate ideologies as Rand Paul, Bill de Blasio and Charlie Crist. I’m sure they all told him to put them on speed dial and to call day or night if he had any questions.
His plan, if one can call it that, makes the naive young heirs to the great fortunes look like grizzled old political veterans by comparison:
Unlike other politically-inclined billionaires, such as the conservative Koch brothers and liberal environmentalist Tom Steyer, Parker hopes to avoid a purely partisan role as he ventures more deeply into politics.
Having donated almost exclusively to Democrats up to this point, Parker made a trip to Washington in December for the purpose of meeting quietly with Republican officeholders and strategists around town. He plans to donate to both sides starting this year, associates say, for the first time committing big sums to aid Republicans he views as credible deal-makers in a bitterly divided Congress.
He’s not even a Mugwump. He’s just a mess. Apparently, he thinks he can “make Washington work” by financing a group of deal makers in both parties who will knock some heads together and get the job done. What, exactly, it is they are supposed to get done remains a mystery. Indeed, from the sound of it, it doesn’t really matter.
I have an idea for Parker. There’s a group of “activists” out there who are right up his alley. He could just buy their outfit outright and rebrand it to his liking. It’s called “No Labels,” and they’ve been flogging this idea of bipartisan nothingness for a very long time. For some unknown reason it just hasn’t taken hold with the public. But if anyone can market this dog, it’s the guy who made hundreds of millions for the singular advice to take “the” out of “The Facebook.”
According to the article, a battalion of opportunistic political consultants from across the ideological spectrum are already on the payroll and are going to make a whole lot of money from this quixotic venture, however it goes, so no matter what, I suppose he’s at least trickling some of his wealth down to lower orders. In the current political environment run by radical right-wing billionaires, Mugwumps and fools, that may be the best we can hope for.
Heather Digby Parton is a writer also well-known as “Digby.” Her political and cultural observations can be found at www.digbysblog.blogspot.com.
Earlier this month, the data firm Equilar published its list of the highest-paid CEOs for 2013. The report paints a picture of vast inequality, with extraordinary sums being paid to a tiny layer of the population. Pay for the top-earning 100 CEOs in the US increased by 9 percent last year, to $13.9 million apiece.
Topping the list was Larry Ellison, the CEO of Oracle Corporation, a man who exemplifies the social character of the ruling class and its manner of wealth accumulation.
Ellison’s total compensation for 2013 was $78.4 million, almost all of it in stock options. Over the eight years that Equilar has tracked executive compensation, Ellison’s cumulative pay was $582 million, almost $83 million more than that of the runner-up, Tim Cook of Apple. His pay in 2013 was more than double that of the runner-up for that year, Walt Disney Company CEO Bob Iger, who was paid $34.3 million.
Ellison’s pay was actually down $18 million from its high in 2012, perhaps a reflection of the slowing performance of Oracle’s stock. His wealth consists largely of his stake in Oracle, and his fortune has been amassed primarily through the medium of the stock market—a practice that has become pervasive among the ruling class since the 1980s and has been vastly accelerated by the policies of the Federal Reserve.
Indeed, Ellison is one of the intended beneficiaries of the Obama administration’s policy since the 2008 crash, which has consisted of making available an unlimited stream of cash to the financial system. The stock market has soared as a result, even as pay for the vast majority of the population has stagnated or declined, and unemployment remains at catastrophic levels.
Oracle Corp. is a developer of business software, such as supply chain management and enterprise resource planning, founded by Ellison and Bob Miner. One of the famed Silicon Valley startups, the company is now a tech giant, with revenues second only to Microsoft in the world of software development.
Ellison runs his company as something of a despot, even toward its shareholders. Though Ellison himself owns only about a quarter of the company’s stock, shareholder votes to roll back his pay package in two consecutive years were treated as nothing more than nonbinding suggestions, and promptly ignored.
Nominal management of the company has catapulted Ellison up the ranks of the super-rich. He is now the fifth-wealthiest person in the world, with a personal net worth of about $50 billion. Ellison is a personification of the obscenity of contemporary capital accumulation, the object of fawning from the media and his fellow aristocrats. A New York Times feature on the CEO recently proclaimed, “It is good to be the king. It is even better to be Larry Ellison.” In this they have surely not exaggerated.
Oracle’s CEO is well-known for his egomaniacal and costly pet projects. His recent sponsorship of the America’s Cup race in San Francisco at a cost of $100 million is a small expense compared with some of his previous undertakings, such as spending $250 million to buy up a third of Malibu.
In 2004 he commissioned a gargantuan custom “superyacht” for $377 million, and his 2012 purchase of the sixth-largest island in the Hawaiian chain, Lanai, was paid for with between $500 million and $600 million—cash. He continues to make changes and “develop” the island as a “model for environmentally sound living.”
Frugal in his own way, Ellison managed to cheat San Mateo County out of $3 million of property taxes by arguing that one of his mansions, an imitation of a Japanese Shogun estate, was “functionally obsolete” and worth hundreds of millions less than he paid for it. As a result, the public school system lost some $1.4 million in tax dollars.
Like many billionaires of his type, he gives freely to both political parties, including some of the most influential figures at the national level, such as Republican House Majority Whip Kevin McCarthy and former senator, now secretary of state in the Obama administration, John Kerry.
Break down Ellison’s pay on the assumption of a 40-hour workweek and the CEO makes more in an hour ($37,692.31) than a typical worker is likely to make in a year—while paying the much lower capital gains tax rate on almost all of it.
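That hourly figure is easy to verify with a back-of-the-envelope calculation, assuming (as the figure above does) a 52-week, 40-hour work year:

```python
# Sanity check on the hourly figure quoted above, assuming
# a 52-week, 40-hour work year (2,080 working hours).
total_pay = 78_400_000        # Ellison's 2013 compensation, in dollars
hours_per_year = 52 * 40      # 2,080 hours

hourly_pay = total_pay / hours_per_year
print(f"${hourly_pay:,.2f} per hour")  # → $37,692.31 per hour
```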
The tragedy of this spectacle hardly needs to be pointed out, when 12.5 percent of the globe’s population sits on the cusp of starvation. In the United States, Ellison’s home, a third of Americans experienced “poverty” in its narrow official definition between 2009 and 2011.
In a period when the most basic forms of assistance to the working class—food stamps, unemployment insurance, education, health care, nurturing of culture—are being systematically dismantled to free up resources for the accumulation of capital by the aristocracy of finance, the likes of Oracle Corp.’s “king” are the incarnation of reaction all down the line.
In the early years of Oracle, possessing a fortune several orders of magnitude smaller than today, Ellison invited the company’s co-founder Bob Miner to come along with him for a joyride on a hired fighter jet. Miner wrote back, perhaps more prophetically than he intended, “You obviously have far more money than you should. It’s things like this that caused the French Revolution.”
What goes for Ellison goes for the social layer of which he is a particular expression. Drunk on wealth, their relationship to society as a whole is fundamentally parasitic. They sit atop a social powder keg—not unlike the aristocracy of the ancien régime. And similar causes produce similar effects.
April 21, 2014
Folks who find themselves rich — and getting richer — typically tend to react psychologically in one of two ways. They can either see themselves as incredibly fortunate or incredibly deserving.
Those rich who come to see the luck in their lives also usually come to understand that they have no more talent — and work no more diligently — than plenty of people who hold not much wealth at all. Those among the rich who see their wealth as a well-deserved reward, on the other hand, often come to see those without wealth as undeserving — of anything except contempt.
The latest sign of that contempt: The rush by localities to criminalize sleeping in cars. Communities where the wealthy predominate — like Palo Alto in Silicon Valley — have been particularly eager to do this criminalizing. Palo Alto last year had over 12 times more homeless people than available shelter beds.
We have housing meanness in America today. We also have housing madness. This week’s Too Much has a bizarre example of the latter. And lots more, too.
GREED AT A GLANCE
Last year’s congressional “fiscal cliff” deal raised the tax rate on ordinary income over $450,000 from 35 to 39.6 percent — and the tax on capital gains from 15 to 20 percent. On paper, taxpayers making over $2.6 million a year — America’s top 0.1 percent — should now each be paying about $232,000 more in federal taxes this year than last. In real life, many may not. The reason: Federal budget cuts have left the IRS, the Associated Press reports, with “fewer agents auditing returns than at any time since at least the 1980s.” IRS funding this year has dropped 7 percent. Last year the agency audited only 10.9 percent of taxpayers with over $1 million in income. This year’s audit rate, IRS chief John Koskinen acknowledged last week, will run lower. That will mean that “some people we should catch,” says Koskinen, “we’re not going to catch.”
They pay Yahoo CEO Marissa Mayer the big bucks — $24.9 million last year — to make the big decisions. Mayer two years ago decided to hire Henrique De Castro to overhaul her stumbling company’s operations. De Castro, Mayer led everyone to believe, “had a unique set of highly valuable skills and experiences.” Fifteen months later, after still more corporate stumbling, Mayer gave Mr. Unique the heave-ho. De Castro is walking away, Yahoo has just disclosed, with $58 million in severance. Shareholders seem none too happy about that. Mayer’s answer? To calm down anger over Yahoo’s executive pay giveaways, she’s bringing on to the Yahoo board H. Lee Scott Jr., the retired Wal-Mart CEO. In his last year as Wal-Mart’s chief, Scott took home $30.2 million, over 1,500 times average Wal-Mart worker pay . . .
For generations, the Guardian reported earlier this month, “fine dining” has meant “haughty waiters, hushed rooms, starched table linen, and endless interruptions to pour wine and water.” But celebrity chefs these days are going casual. At the UK’s uber trendy House of Tides, for instance, Michelin-starred chef Kenny Atkinson has “dispensed” with “hovering waiters” — and has diners pouring their own wine. And no fancy-pants dress code either. Atkinson doesn’t mind if people come dressed in jeans: “Their money is as good as anybody else’s.” Any diners in jeans will need, of course, to bring plenty of the money Atkinson so prizes. His tasting menu starts at $92.
Quote of the Week
“Even in this era of extreme partisanship, broad bipartisan agreement supports an agenda that helps the wealthy — including austerity budgets, free trade, big bank bailouts, and policing the world. Big Oil, Big Pharma, Big Agra, the health insurance industry, and Wall Street deploy legions of lobbyists to make it clear that messing with them costs dearly.”
PETULANT PLUTOCRAT OF THE WEEK
Recreation? Howard Lutnick, the CEO at high-finance power Cantor Fitzgerald, can’t get enough of it. The 40-acre Hamptons estate he bought back in 2003 — at a cost of $56 million — came with a swimming pool, spa, and tennis court. Lutnick moved quickly to add a basketball court and a barn big enough to house an equestrian team. But three different zoning and planning boards refused to grant the permits for Lutnick’s additions. The chief exec then sued the boards — and all their individual members — for $56 million in damages. Wall Streeters who frolic in the Hamptons every summer, one of those sued noted last week, “don’t like to be told what you can or can’t do.” Lutnick’s lawsuits, he added, amount to an attempt at “intimidation plain and simple.”
IMAGES OF INEQUALITY
Get the dust mops ready. This French-style chateau in Connecticut’s New Canaan has gone empty the last 60-odd years. But fashion designer Reed Krakoff has just picked up the 52-acre estate for a relative song, a mere $14 million. The manse went on the market after the 2011 death of the long-time owner, Huguette Clark, the 104-year-old copper heiress. Clark had bought the manse in 1951 as a refuge in case the Russkies ever threatened to drop an atom bomb on her Manhattan isle home. She never moved in. She never even furnished the place.
PROGRESS AND PROMISE
With vivid graphics and first-person worker testimonials, the 2014 edition of the AFL-CIO’s online PayWatch is putting some needed new pressure on executive pay excess. CEOs at companies listed in the S&P 500, the labor site calculates, took home 331 times the pay of average American workers last year — and 774 times the take-home of minimum-wage workers. The backdrop for this gap: Corporate profits in 2013 — for the nation’s top 500 corporations — averaged $41,249 per employee, a 38 percent jump over the profit level in 2008. If the minimum wage had kept up with top 1 percent income gains since 1968, adds the new PayWatch, minimum-wage workers would now be making $31.45 per hour.
At the new PayWatch site, Americans working at major firms can compare their pay to their CEO’s compensation — and CEO pay in their state. Spread the word and help build the charge for a smaller corporate pay gap.
INEQUALITY BY THE NUMBERS
Stat of the Week
Oracle CEO Larry Ellison leads the latest New York Times list of America’s most highly paid CEOs. Ellison makes more per second, notes analyst Deborah Meier, than the current federal hourly minimum wage.
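The per-second claim checks out if one counts only working hours. A rough sketch, assuming a 2,080-hour work year and the $7.25 federal minimum wage then in effect:

```python
# Rough check of the per-second claim, assuming Ellison's
# $78.4 million 2013 pay is spread over a 2,080-hour work
# year; $7.25 was the 2014 federal minimum wage.
annual_pay = 78_400_000
seconds_worked = 52 * 40 * 3600   # 7,488,000 seconds

per_second = annual_pay / seconds_worked
print(f"${per_second:.2f} per second vs. $7.25 per hour")
# → $10.47 per second vs. $7.25 per hour
```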
Why Our Sky Sometimes Does Start Falling
You don’t have to be a rocket scientist to demonstrate the link between inequality and catastrophic environmental change. But a little help from rocket scientists certainly doesn’t hurt.
The sky, we all learn as children, is not falling — and never falls. Only silly Chicken Littles prattle about “precipitous collapses.”
Only silly Chicken Littles, apparently, and applied mathematicians.
One of those mathematicians, the University of Maryland’s Safa Motesharrei, has joined with two colleagues to publish a new paper that sees the “precipitous collapse” of our global order as a distinct possibility.
In fact, the three conclude, that possibility will become a hard-to-avoid probability unless the world becomes a far less unequal place.
Motesharrei and his fellow researchers — University of Maryland meteorologist Eugenia Kalnay, a former Goddard Space Flight Center exec, and University of Minnesota political scientist Jorge Rivas — reached that conclusion after running “a range of hypothetical scenarios” through an innovative model developed with funding support from NASA, America’s space agency.
Scientists at NASA usually spend their time looking up at the heavens. The investigators behind this new study kept their focus distinctly earth-bound.
Sophisticated human civilizations, the three investigators point out, have in the past collapsed and on a fairly regular basis. The Romans broke down, as did the Han in China and the Gupta in India, the Maya in Central America, and a variety of Mesopotamian civilizations.
These collapses, Motesharrei and his collaborators note, naturally raise the question whether we today remain “similarly susceptible.” Or can our modern civilization, with all our “greater technological capacity, scientific knowledge, and energy resources,” survive whatever did in our sophisticated predecessors?
And what did these predecessors in? In previous collapses, we see some similar patterns. The doomed societies overextended themselves environmentally. They depleted their natural resources at an unsustainable pace — and failed to see, despite their sophistication, the warning signs of their impending implosion. They soldiered on, oblivious to the danger.
Or rather, to be more precise, the elite movers and shakers of these societies soldiered on. In deeply unequal societies, elites seldom feel the strain and pain that environmental degradation engenders — until that degradation has gone too far to reverse.
This “buffer of wealth,” as the Motesharrei team puts it, “allows elites to continue ‘business as usual.’”
Could we go down the same clueless path? The Motesharrei study explores that question with “the first model of its kind that studies the impacts of inequality on the fate of societies.” Under conditions “closely reflecting the reality of the world today,” the study finds an eventual collapse “difficult to avoid.”
Difficult but not impossible. We can avoid collapse, the Motesharrei paper notes, if we reduce the “per capita rate of depletion of nature” to a “sustainable level” and start distributing resources in a more “reasonably equitable fashion.”
For specifics on what that would actually mean in practice, we need to look elsewhere. The new Motesharrei paper soars at a theoretical level and offers no practical roadmap to a more sustainable and equal society.
Other analysts have made that effort, no one more so than Herman Daly, the former World Bank senior economist now widely considered the founding father of ecological economics.
Daly and his colleagues have been calling for a “steady-state economy” that acknowledges the limits of our physical world. We can’t go on, they contend, endlessly depleting our stocks of natural resources and polluting the world with the wastes from our resource exploitation.
“We need,” says Daly, “to build the physical constraints of a finite biophysical environment into our economic theory.”
And into that theory, he adds, we need to build justice, too. This past fall, Daly spelled out ten specific steps that could help us push back against our current unsteady state. Among them: “Limit the range of inequality in income distribution with a minimum income and a maximum income.”
“Rich and poor separated by a factor of 500,” Daly observes, “have few experiences or interests in common.”
Maybe not even, the new Motesharrei study suggests, avoiding a cataclysmic environmental collapse.
Matt Taibbi, The Super Rich in America Have Become ‘Untouchables’ Who Don’t Go to Prison, Democracy Now! April 15, 2014. Our income gap reflects a “justice” gap.
Susan Holmberg, The Pay’s the Thing: How America’s CEOs Are Getting Rich Off Taxpayers, Next New Deal, April 16, 2014. The price we pay for tolerating the performance pay loophole.
Robert Reich, Antitrust in the New Gilded Age, April 16, 2014. America is facing the same concentrations of wealth and economic power that endangered democracy a century ago.
Howard Steven Friedman, American Inequality: Ticking Time Bomb, Huffington Post, April 17, 2014. Plutarch had it right: An “imbalance between rich and poor” remains our most “fatal ailment.”
Floyd Norris, Merely Rich and Superrich: The Tax Gap Is Narrowing, New York Times, April 17, 2014. A step toward a tax policy less hostile to work.
Will Hutton, Extravagant CEO pay doesn’t reflect performance, it’s all about status, Observer, April 19, 2014. We now live in an era of “conspicuous executive pay,” only understandable as a social phenomenon because its excess has ceased to have any economic logic.
Zoë Carpenter, Will Phony Populists Hijack the Fight Against Inequality? Nation, April 21, 2014. Plenty of top Dems are still arguing that wealth’s concentration doesn’t really matter.
NEW AND NOTABLE
Do Americans Still Live in a Real Democracy?
Martin Gilens and Benjamin Page, Testing Theories of American Politics: Elites, Interest Groups, and Average Citizens, Perspectives on Politics, forthcoming Fall 2014.
America’s political scientists have been arguing for generations over who really runs the United States. Do Americans have a democracy where the people rule? Or something else? Do American citizens, as political scientists Martin Gilens and Benjamin Page ask in this blockbuster new paper just published online, rate as “sovereign, semi-sovereign, or largely powerless”?
Gilens and Page, distinguished professors from Princeton and Northwestern, give a surprisingly chilling response to that question, based on their analysis of “a unique data set” that culled responses to 1,779 policy questions pollsters asked between 1981 and 2002.
The researchers crunched this set of response data by income level and then probed to see whose policy preferences actually prevailed.
“Economic elites and organized groups representing business interests have substantial independent impacts on U.S. government policy,” they conclude, “while mass-based interest groups and average citizens have little or no independent influence.”
One analyst is already calling this new Gilens-Page paper “the first-ever scientific study” of the question whether the United States can rightfully claim to be a democracy. Gilens and Page, for their part, pull no academic punches.
“Our analyses suggest that majorities of the American public,” the pair write, “actually have little influence over the policies our government adopts.”
The nation, the scholars note, does sport “many features central to democratic governance,” everything from elections to freedom of speech and association.
“But we believe that if policy making is dominated by powerful business organizations and a small number of affluent Americans,” they go on to add, “then America’s claims to being a democratic society are seriously threatened.”
Researchers at Princeton and Northwestern universities have pored over 1,800 US policies and concluded that America is an oligarchy. Instead of looking out for the majority of the country’s citizens, the US government is ruled by the interests of the rich and the powerful, they found. No great surprises there, then.
But the government is not the only American power whose motivations need to be rigorously examined. Some 2,400 miles away from Washington, in Silicon Valley, Google is aggressively gaining power with little to keep it in check.
It has cosied up to governments around the world so effectively that its chairman, Eric Schmidt, is a White House advisor. In Britain, its executives meet with ministers more than almost any other corporation.
Google can’t be blamed for this: one of its jobs is to lobby for laws that benefit its shareholders, but it is up to governments to push back. As things stand, Google – and to a lesser extent, Facebook – are in danger of becoming the architects of the law.
Meanwhile, these companies are becoming ever more sophisticated about the amount of information they gather about their users. Google scans our emails. It knows where we are. It anticipates what we want before we even know it. Sure, there are privacy settings and all that, but avoiding Google feels nigh on impossible if you want to live in the 21st century. It doesn’t stop there, either. If Google Glass is widely adopted, it will be able to clock everything we see, while the advance of Google Wallet could position the company at the heart of much of the world’s spending.
One source at the technology giant put it well when she referred to the company as an “unelected superpower”. I think this is a fair summary. So far, we are fortunate that this dictatorship is a relatively benign one. The company’s mantra is “don’t be evil”, and while people may disagree on what evil means, broadly speaking, its founders are pretty good guys. But Larry Page and Sergey Brin will not be around forever. Nor should we rely on any entity that powerful to regulate its own behaviour.
The government in America, and its counterparts around the world, should stop kowtowing to Google and instead work in concert to keep this and any other emerging corporate superpowers firmly in check.
From the Organization Man to open-plan design, a new history of the way the office has shaped the modern world
Over the past week, as I’ve been carrying around a copy of Nikil Saval’s “Cubed: A Secret History of the Workplace,” I’ve gotten some quizzical looks. “It’s a history of the office,” I’d explain, whereupon a good number of people would respond, “Well, that sounds boring.”
It isn’t. In fact, “Cubed” is anything but, despite an occasional tendency to read like a grad school thesis. The fact that anyone would expect it to be uninteresting is striking, though, and marks an unexpected limit to the narcissistic curiosity of the average literate American. The office, after all, is where most contemporary labor takes place and is a crucible of our culture. We’ve almost all worked in one. Of course it’s a subject that merits a thoroughly researched and analytical history, because the history of the office is also the history of us.
Saval’s book glides smoothly between his two primary subjects: the physical structure of offices and the social institution of white-collar work over the past 150 years or so. “Cubed” encompasses everything from the rise of the skyscraper to the entrance of women into the professional workplace to the mid-20th-century angst over grey-flannel-suit conformity to the dorm-like “fun” workplaces of Silicon Valley. His stance is skeptical, a welcome approach given that most writings on the contemporary workplace are rife with dubious claims to revolutionary innovation — office design or management gimmicks that bestselling authors indiscriminately pounce on like magpies seizing glittering bits of trash.
Some of the most fascinating parts of “Cubed” come in the book’s early chapters, in which Saval traces the roots of the modern office to the humble 19th-century countinghouse. Such firms — typically involved in the importation or transport of goods — were often housed in a single room, with one or more of the partners working a few feet away from a couple of clerks, who copied and filed documents, as well as a bookkeeper. A novice clerk might earn less than skilled manual laborers, but he had the potential to make significantly more, and he could boast the intangible but nevertheless significant status of working with his mind rather than his hands.
Even more formative to the identity of white-collar workers (so named for the detachable collars, changed daily in lieu of more regular washings of the actual shirt, that served as a badge of the class) was the proximity to the boss himself and the very real possibility of advancement to the role of partner. Who better to marry the boss’ daughter and take over when he retired? Well, one of the other clerks is who, and in this foundational rivalry much of the character of white-collar work was set. “Unlike their brothers in the factory, who had begun to see organizing on the shop floors as a way to counter the foul moods and arbitrary whims of their bosses,” Saval writes, “clerks saw themselves as potential bosses.” Blue-collar workers, by contrast, knew they weren’t going to graduate to running the company one day.
The current American ethic of self-help and individualism owes at least as much to the countinghouse as it does to the misty ideal of the Jeffersonian yeoman farmer. An ambitious clerk identified and sought to ingratiate himself with the partners, not his peers, who were, after all, his competitors. He was on his own, and had only himself to blame if he failed. A passion for self-education and self-improvement, via books and night schools and lecture series, took root and flourished. So did a culture of what Saval calls “unctuous male bonding,” the ancestor of the 20th century’s golf outings and three-martini lunches, customs that would make it that much harder for outsiders like women and ethnic minorities to rise in the ranks — once they managed to get into the ranks in the first place.
The meritocratic dream became a lot more elusive in the 1920s, when the population employed in white-collar work surpassed the number of blue-collar workers for the first time. Saval sees this stage in the evolution of the office as epitomized in the related boom in skyscrapers. The towering buildings, enabled by new technologies like elevators and steel-frame construction, were regarded as the concrete manifestation of American boldness and ambition; certainly modern architects, reaching for new heights of self-aggrandizement, encouraged that view. They rarely acknowledged that a skyscraper was, at heart, a hive of standardized cells in which human beings were slotted like interchangeable parts. Most of the workers inhabiting those cells, increasingly female, could not hope to climb to the executive suites.
Office workers had always made observers uncomfortable; the clerk, with his “minute leg, chalky face and hollow chest,” was one of the few members of the American multitude that Walt Whitman scorned to contain. After World War II, that unease grew into a nationwide obsession, bumping several works of not-so-pop sociology — “The Lonely Crowd,” “The Man in the Gray Flannel Suit,” “The Organization Man,” etc. — onto the bestseller lists. In turn, the challenge of managing this new breed of “knowledge workers” became the subject of intensive rumination and theorizing. Saval lists Douglas McGregor’s 1960 guide, “The Human Side of Enterprise,” as the seminal work in a field that would spawn such millionaire gurus as Tom Peters and Peter Drucker.
Office design — typically regimented rows of identical desks — was deemed in need of an overhaul as well, and perhaps the most rueful story Saval recounts is that of Robert Propst, hired to head the research wing of the Herman Miller company in 1958. Propst made an intensive study of white-collar work and in 1964, Herman Miller unveiled his Action Office, a collection of pieces designed to support work on “thinking projects.” Saval praises this incarnation of the Action Office as the first instance in which “the aesthetics of design and progressive ideas about human needs were truly united,” but it didn’t entirely catch on until the unveiling of Action Office II, which incorporated wooden frames covered with fabric that served as short, modular, temporary walls.
Oh, what a difference 30 degrees makes! Propst’s original Action Office II arranged these walls at 120-degree angles, a configuration that looked and felt open and dynamic. One of Propst’s colleagues told Saval of the dismal day that “someone figured out you don’t need the 120-degree [angle] and it went click.” Ninety degrees used up the available space much more efficiently, enabling employers to cram in more workstations. The American office worker had been cubed.
The later chapters of “Cubed” speak to more recent trends, like the all-inclusive, company-town-like facilities of tech firms like Google and wacky experiments in which no one is allowed to have a permanent desk at all. Saval visits the Los Angeles office of the ad agency Chiat/Day, which is organized around an artificial main street, features a conference table made of surfboards and resembles nothing so much as a theme park. Using architecture and amenities to persuade employees that their work is also play is a gambit to keep them on the premises and producing for as much of the day (and night) as possible. It all seems a bit desperate. In my own experience, if people 1) are paid sufficiently; 2) like the other people they work with; and 3) find the work they do a meaningful use of their energy and skills, then they don’t really care if they work in cubicles or on picnic benches. Item No. 3 is a bitch, though, the toughest criterion for most contemporary corporations to meet. Maybe that’s what they ought to be worrying about instead.
The following is an excerpt from “Overpowered: What Science Tells Us About the Dangers of Cell Phones and Other Wifi-age Devices” by Martin Blank, PhD. Published by Seven Stories Press, March 2014. ISBN 978-1-60980-509-8. All rights reserved.
This excerpt was originally published by Salon.com.
You may not realize it, but you are participating in an unauthorized experiment—“the largest biological experiment ever,” in the words of Swedish neuro-oncologist Leif Salford. For the first time, many of us are holding high-powered microwave transmitters—in the form of cell phones—directly against our heads on a daily basis.
Cell phones generate electromagnetic fields (EMF), and emit electromagnetic radiation (EMR). They share this feature with all modern electronics that run on alternating current (AC) power (from the power grid and the outlets in your walls) or that utilize wireless communication. Different devices radiate different levels of EMF, with different characteristics.
What health effects do these exposures have?
Therein lies the experiment.
The many potential negative health effects from EMF exposure (including many cancers and Alzheimer’s disease) can take decades to develop. So we won’t know the results of this experiment for many years—possibly decades. But by then, it may be too late for billions of people.
Today, while we wait for the results, a debate rages about the potential dangers of EMF. The science of EMF is not easily taught, and as a result, the debate over the health effects of EMF exposure can get quite complicated. To put it simply, the debate has two sides. On the one hand, there are those who urge the adoption of a precautionary approach to the public risk as we continue to investigate the health effects of EMF exposure. This group includes many scientists, myself included, who see many danger signs that call out strongly for precaution. On the other side are those who feel that we should wait for definitive proof of harm before taking any action. The most vocal of this group include representatives of industries who undoubtedly perceive threats to their profits and would prefer that we continue buying and using more and more connected electronic devices.
This industry effort has been phenomenally successful, with widespread adoption of many EMF-generating technologies throughout the world. But EMF has many other sources as well. Most notably, the entire power grid is an EMF-generation network that reaches almost every individual in America and 75% of the global population. Today, early in the 21st century, we find ourselves fully immersed in a soup of electromagnetic radiation on a nearly continuous basis.
What we know
The science to date about the bioeffects (biological and health outcomes) resulting from exposure to EM radiation is still in its early stages. We cannot yet predict that a specific type of EMF exposure (such as 20 minutes of cell phone use each day for 10 years) will lead to a specific health outcome (such as cancer). Nor are scientists able to define what constitutes a “safe” level of EMF exposure.
However, while science has not yet answered all of our questions, it has determined one fact very clearly—all electromagnetic radiation impacts living beings. As I will discuss, science demonstrates a wide range of bioeffects linked to EMF exposure. For instance, numerous studies have found that EMF damages and causes mutations in DNA—the genetic material that defines us as individuals and collectively as a species. Mutations in DNA are believed to be the initiating steps in the development of cancers, and it is the association of cancers with exposure to EMF that has led to calls for revising safety standards. This type of DNA damage is seen at levels of EMF exposure equivalent to those resulting from typical cell phone use.
The damage to DNA caused by EMF exposure is believed to be one of the mechanisms by which EMF exposure leads to negative health effects. Multiple separate studies indicate significantly increased risk (up to two and three times normal risk) of developing certain types of brain tumors following EMF exposure from cell phones over a period of many years. One review that averaged the data across 16 studies found that the risk of developing a tumor on the same side of the head as the cell phone is used is elevated 240% for those who regularly use cell phones for 10 years or more. An Israeli study found that people who use cell phones at least 22 hours a month are 50% more likely to develop cancers of the salivary gland (and there has been a four-fold increase in the incidence of these types of tumors in Israel between 1970 and 2006). And individuals who lived within 400 meters of a cell phone transmission tower for 10 years or more were found to have a rate of cancer three times higher than those living at a greater distance. Indeed, the World Health Organization (WHO) designated EMF—including power frequencies and radio frequencies—as a possible cause of cancer.
While cancer is one of the primary classes of negative health effects studied by researchers, EMF exposure has been shown to increase risk for many other types of negative health outcomes. In fact, levels of EMF thousands of times lower than current safety standards have been shown to significantly increase risk for neurodegenerative diseases (such as Alzheimer’s and Lou Gehrig’s disease) and male infertility associated with damaged sperm cells. In one study, those who lived within 50 meters of a high voltage power line were significantly more likely to develop Alzheimer’s disease when compared to those living 600 meters or more away. The increased risk was 24% after one year, 50% after 5 years, and 100% after 10 years. Other research demonstrates that using a cell phone between two and four hours a day leads to 40% lower sperm counts than found in men who do not use cell phones, and the surviving sperm cells demonstrate lower levels of motility and viability.
EMF exposure (as with many environmental pollutants) not only affects people, but all of nature. In fact, negative effects have been demonstrated across a wide variety of plant and animal life. EMF, even at very low levels, can interrupt the ability of birds and bees to navigate. Numerous studies link this effect with the phenomenon of avian tower fatalities (in which birds die from collisions with power-line and communications towers). These same navigational effects have been linked to colony collapse disorder (CCD), which is devastating the global population of honey bees (in one study, placement of a single active cell phone in front of a hive led to the rapid and complete demise of the entire colony). And a mystery illness affecting trees around Europe has been linked to WiFi radiation in the environment.
There is a lot of science—high-quality, peer-reviewed science—demonstrating these and other very troubling outcomes from exposure to electromagnetic radiation. These effects are seen at levels of EMF that, according to regulatory agencies like the Federal Communications Commission (FCC), which regulates cell phone EMF emissions in the United States, are completely safe.
An unlikely activist
I have worked at Columbia University since the 1960s, but I was not always focused on electromagnetic fields. My PhDs in physical chemistry from Columbia University and colloid science from the University of Cambridge provided me with a strong, interdisciplinary academic background in biology, chemistry, and physics. Much of my early career was spent investigating the properties of surfaces and very thin films, such as those found in a soap bubble, which then led me to explore the biological membranes that encase living cells.
I studied the biochemistry of infant respiratory distress syndrome (IRDS), which causes the lungs of newborns to collapse (also called hyaline membrane disease). Through this research, I found that the substance on the surface of healthy lungs could form a network that prevented collapse in healthy babies (the absence of which causes the problem for IRDS sufferers).
A food company subsequently hired me to study how the same surface support mechanism could be used to prevent the collapse of the air bubbles added to their ice cream. As ice cream is sold by volume and not by weight, this enabled the company to reduce the actual amount of ice cream sold in each package. (My children gave me a lot of grief about that job, but they enjoyed the ice cream samples I brought home.)
I also performed research exploring how electrical forces interact with the proteins and other components found in nerve and muscle membranes. In 1987, I was studying the effects of electric fields on membranes when I read a paper by Dr. Reba Goodman demonstrating some unusual effects of EMF on living cells. She had found that even relatively weak power fields from common sources (such as those found near power lines and electrical appliances) could alter the ability of living cells to make proteins. I had long understood the importance of electrical forces on the function of cells, but this paper indicated that magnetic forces (which are a key aspect of electromagnetic fields) also had significant impact on living cells.
Like most of my colleagues, I did not think this was possible. By way of background, there are some types of EMF that everyone had long acknowledged are harmful to humans. For example, X-rays and ultraviolet radiation are both recognized carcinogens. But these are ionizing forms of radiation. Dr. Goodman, however, had shown that even non-ionizing radiation, which has much less energy than X-rays, was affecting a very basic property of cells—the ability to stimulate protein synthesis.
Because non-ionizing forms of EMF have so much less energy than ionizing radiation, it had long been believed that non-ionizing electromagnetic fields were harmless to humans and other biological systems. And while it was acknowledged that a high enough exposure to non-ionizing EMF could cause a rise in body temperature—and that this temperature increase could cause cell damage and lead to health problems—it was thought that low levels of non-ionizing EMF that did not cause this rise in temperature were benign.
In over 20 years of experience at some of the world’s top academic institutions, this is what I’d been taught and this is what I’d been teaching. In fact, my department at Columbia University (like every other comparable department at other universities around the world) taught an entire course in human physiology without even mentioning magnetic fields, except when they were used diagnostically to detect the effects of the electric currents in the heart or brain. Sure, magnets and magnetic fields can affect pieces of metal and other magnets, but magnetic fields were assumed to be inert, or essentially powerless, when it came to human physiology.
As you can imagine, I found the research in Dr. Goodman’s paper intriguing. When it turned out that she was a colleague of mine at Columbia, with an office just around the block, I decided to follow up with her, face-to-face. It didn’t take me long to realize that her data and arguments were very convincing. So convincing, in fact, that I not only changed my opinion on the potential health effects of magnetism, but I also began a long collaboration with her that has been highly productive and personally rewarding.
During our years of research collaboration, Dr. Goodman and I published many of our results in respected scientific journals. Our research was focused on the cellular level—how EMFs permeate the surfaces of cells and affect cells and DNA—and we demonstrated several observable, repeatable health effects from EMF on living cells. As with all findings published in such journals, our data and conclusions were peer reviewed. In other words, our findings were reviewed prior to publication to ensure that our techniques and conclusions, which were based on our measurements, were appropriate. Our results were subsequently confirmed by other scientists, working in other laboratories around the world, independent from our own.
A change in tone
Over the roughly 25 years Dr. Goodman and I have been studying the EMF issue, our work has been referenced by numerous scientists, activists, and experts in support of public health initiatives including the BioInitiative Report, which was cited by the European Parliament when it called for stronger EMF regulations. Of course, our work was criticized in some circles, as well. This was to be expected, and we welcomed it—discussion and criticism is how science advances. But in the late 1990s, the criticism assumed a different character, both angrier and more derisive than past critiques.
On one occasion, I presented our findings at a US Department of Energy annual review of research on EMF. As soon as I finished my talk, a well-known Ivy League professor said (without any substantiation) that the data I presented were “impossible.” He was followed by another respected academic, who stated (again without any substantiation) that I had most likely made some “dreadful error.” Not only were these men wrong, but they delivered their comments with an intense and obvious hostility.
I later discovered that both men were paid consultants of the power industry—one of the largest generators of EMF. To me, this explained the source of their strong and unsubstantiated assertions about our research. I was witnessing firsthand the impact of private, profit-driven industrial efforts to confuse and obfuscate the science of EMF bioeffects.
Not the first time
I knew that this was not the first time industry opposed scientific research that threatened their business models. I’d seen it before many times with tobacco, asbestos, pesticides, hydraulic fracturing (or “fracking”), and other industries that paid scientists to generate “science” that would support their claims of product safety.
That, of course, is not how sound science works. Science involves generating and testing hypotheses. One draws conclusions from the available, observable evidence that results from rigorous and reproducible experimentation. Science is not sculpting evidence to support your existing beliefs. That’s propaganda. As Dr. Henry Lai (who, along with Dr. Narendra Singh, performed the groundbreaking research demonstrating DNA damage from EMF exposure) explains, “a lot of the studies that are done right now are done purely as PR tools for the industry.”
An irreversible trend
Of course EMF exposure—including radiation from smart phones, the power lines that you use to recharge them, and the other wide variety of EMF-generating technologies—is not equivalent to cigarette smoking. Exposure to carcinogens and other harmful forces from tobacco results from the purely voluntary, recreational activity of smoking. If tobacco disappeared from the world tomorrow, a lot of people would be very annoyed, tobacco farmers would have to plant other crops, and a few firms might go out of business, but there would be no additional impact.
In stark contrast, modern technology (the source of the human-made electromagnetic fields discussed here) has fueled a remarkable degree of innovation, productivity, and improvement in the quality of life. If tomorrow the power grid went down, all cell phone networks would cease operation, millions of computers around the world wouldn’t turn on, and the night would be illuminated only by candlelight and the moon—we’d have a lot less EMF exposure, but at the cost of the complete collapse of modern society.
EMF isn’t just a by-product of modern society. EMF, and our ability to harness it for technological purposes, is the cornerstone of modern society. Sanitation, food production and storage, health care—these are just some of the essential social systems that rely on power and wireless communication. We have evolved a society that is fundamentally reliant upon a set of technologies that generate forms and levels of electromagnetic radiation not seen on this planet prior to the 19th century.
As a result of the central role these devices play in modern life, individuals are understandably predisposed to resist information that may challenge the safety of activities that result in EMF exposures. People simply cannot bear the thought of restricting their time with—much less giving up—these beloved gadgets. This gives industry a huge advantage because there is a large segment of the public that would rather not know.
My message is not to abandon gadgets—like most people, I too love and utilize EMF-generating gadgets. Instead, I want you to realize that EMF poses a real risk to living creatures and that industrial and product safety standards must and can be reconsidered. The solutions I suggest are not prohibitive. I recommend that as individuals we adopt the notion of “prudent avoidance,” minimizing our personal EMF exposure and maximizing the distance between us and EMF sources when those devices are in use. Just as you use a car with seat belts and air bags to increase the safety of the inherently dangerous activity of driving your car at a relatively high speed, you should consider similar risk-mitigating techniques for your personal EMF exposure.
On a broader social level, adoption of the Precautionary Principle in establishing new, biologically based safety standards for EMF exposure for the general public would be, I believe, the best approach. Just as the United States became the first nation in the world to regulate the production of chlorofluorocarbons (CFCs) when science indicated the threat to earth’s ozone layer—long before there was definitive proof of such a link—our governments should respond to the significant public health threat of EMF exposure. If EMF levels were regulated just as automobile carbon emissions are regulated, this would force manufacturers to design, create, and sell devices that generate much lower levels of EMF.
No one wants to return to the dark ages, but there are smarter and safer ways to approach our relationship—as individuals and across society—with the technology that exposes us to electromagnetic radiation.
Feeding America, the US national network of food banks, released its annual report on local food insecurity Thursday, showing that one in six Americans, including one in five children, did not have enough to eat at some point in 2012.
The report found that there are dozens of counties where more than a third of children do not get enough to eat. The incidence of hunger has grown dramatically. The percentage of households that are “food insecure” rose from 11.1 percent in 2007 to 16.0 percent in 2012.
Food insecurity is more widespread in the United States than in any other major developed country. According to separate data from the Organization for Economic Cooperation and Development (OECD), the rate of food insecurity in the US is nearly twice that of the European Union.
The growth of food insecurity has paralleled the growth of extreme poverty. The number of American households that live on less than $2 a day per person more than doubled between 1996 and 2011, from 636,000 to 1.46 million. There are now nearly 3 million children who live in households that earn less than $2 per day. Fresh milk and dairy products are a luxury, as are fruit and vegetables. With an average food stamp allotment of $1.40 per person, per meal, it is not possible to buy the types of food required to maintain “an active, healthy lifestyle.”
Widespread hunger exists alongside the most shameless displays of wealth. Just last week, Copper Beech Farm, a palatial 50-acre estate just outside New York City, sold for $120 million, setting a new record for the most expensive home sale in history. The ultra-luxury car market is also booming. Luxury carmaker Bentley, which recently introduced a revamped quarter-million-dollar, twelve-cylinder coupe, said sales were up by 17 percent last year.
Christie’s, the art auction house, sold $7.1 billion worth of art last year, a 16 percent increase from the year before and the highest on record. This included the $142 million sale of Francis Bacon’s “Three Studies of Lucian Freud,” the most expensive art sale on record, to casino mogul Stephen A. Wynn.
According to Feeding America, food-insecure people reported needing an average of $2.26 per person, per day to have enough food. On that basis, ensuring that all 16 million hungry children in the US had enough to eat would cost just $13 billion a year. There are 80 billionaires in the United States whose wealth, as individuals, exceeds this amount.
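The arithmetic behind that $13 billion figure is easy to check. A quick back-of-the-envelope sketch (the per-day shortfall and child count are the article’s; the 365-day year and the rounding are my assumptions):

```python
# Annual cost of closing the reported food budget shortfall for
# all food-insecure children in the US (figures from the article).
shortfall_per_person_per_day = 2.26   # dollars (Feeding America figure)
food_insecure_children = 16_000_000   # from the article
days_per_year = 365                   # assumed; the report does not specify

annual_cost = shortfall_per_person_per_day * days_per_year * food_insecure_children
print(f"${annual_cost / 1e9:.1f} billion per year")  # → $13.2 billion per year
```

The result, roughly $13.2 billion, is consistent with the article’s “just $13 billion a year.”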
In an earlier period of American history, these levels of poverty and hunger amidst opulence were seen as a national disgrace. Michael Harrington’s 1962 exposé of poverty in Appalachia and elsewhere, The Other America, moved sections of the political establishment of the day to support such reforms as Medicare, Medicaid and Food Stamps. There is no significant tendency in the political or media establishment today that identifies itself as “liberal” and supports genuine social reform.
The prevalence of hunger is “higher than at any time since the Great Depression,” Feeding America told the WSWS in an interview earlier this week. Yet the spread of hunger is barely noted by politicians of either party or by the media.
The responsibility for this social disaster lies with the capitalist system and its political defenders. Administrations, both Democratic and Republican, have starved antipoverty programs of funds for decades. The Obama administration and Congress have overseen two successive food stamp cuts over the past six months: first in November, when benefits were slashed across the board by $36 per month for a family of four, and again in January of this year, when benefits were cut by an average of $90 per month for nearly a million households.
In between these two cuts, the White House and Congress allowed federal extended jobless benefits to lapse for some three million people, together with their two million dependent children. These cruel and inhuman actions come from an administration that has transferred trillions of dollars to Wall Street while refusing to prosecute the financial criminals responsible for the 2008 crash.
Obama’s signature social initiative, Obamacare, is already being exposed as a scheme to reduce the quantity and quality of health care available to ordinary working people while increasing their out-of-pocket costs. Government spending on health care will be slashed, Medicare gutted, and corporate outlays reduced, boosting the profits of the insurance monopolies, hospital chains and pharmaceutical companies.
Then there is the corporate-controlled media, which treats the roaring stock market and lavish displays of wealth by the rich with either open or thinly veiled enthusiasm, while barely acknowledging the existence of poverty. Major networks spend just 0.2 percent of their airtime covering issues relating to poverty, according to data from the Pew Research Center’s Project for Excellence in Journalism.
As for the auxiliary organizations of the corporations and the government, the trade unions, they too have facilitated the attack on working people. As the ruling class has carried out an unprecedented redistribution of wealth from the bottom to the top, the unions have focused their efforts on suppressing working-class opposition and preventing a political break with the Democratic Party.
What is emerging is the true, brutal face of capitalism, a system that piles up vast wealth at one pole of society and ever-greater poverty and wretchedness at the other.
This is true not only in the United States, but internationally. To satisfy the dictates of the banks, brutal austerity measures are being imposed around the world. The 85 richest individuals in the world now possess more wealth than the bottom half of the global population—some 3.5 billion people.
The means exist to provide all people with the necessities for a comfortable and fulfilling life—healthy food, decent housing, health care, education, access to culture and recreation. But the capitalist system, and the ruling class that sits atop it, make any rational control of production and distribution impossible. This system must be done away with and replaced with socialism—the rational planning of society under the democratic control of the working class to meet social needs, not the drive of a financial aristocracy for personal wealth and corporate profit.