7 facts that show the American dream is dead

A living wage, retirement security and a life free of debt are now only accessible to the country’s wealthiest

Richard (RJ) Eskow, AlterNet

This article originally appeared on AlterNet.

A recent poll showed that more than half of all people in this country don’t believe that the American dream is real. Fifty-nine percent of those polled in June agreed that “the American dream has become impossible for most people to achieve.” More and more Americans believe there is “not much opportunity” to get ahead.

The public has reached this conclusion for a very simple reason: It’s true. The key elements of the American dream—a living wage, retirement security, the opportunity for one’s children to get ahead in life—are now unreachable for all but the wealthiest among us. And it’s getting worse. As inequality increases, the fundamental elements of the American dream are becoming increasingly unaffordable for the majority.

Here are seven ways the American dream is dying.

1. Most people can’t get ahead financially.

If the American dream means a reasonable rate of income growth for working people, most people can’t expect to achieve it.

As Ben Casselman observes at fivethirtyeight.com, the middle class hasn’t seen its wages rise in 15 years. In fact, the percentage of middle-class households in this nation is actually falling. Median household income has fallen since the financial crisis of 2008, while income for the wealthiest Americans has actually risen.

Thomas Edsall wrote in the New York Times that “Not only has the wealth of the very rich doubled since 2000, but corporate revenues are at record levels.” Edsall also observed that “In 2013, according to Goldman Sachs, corporate profits rose five times faster than wages.”

2. The stay-at-home parent is a thing of the past.

There was a time when middle-class families could lead a comfortable lifestyle on one person’s earnings. One parent could work while the other stayed home with the kids.



Those days are gone. As Elizabeth Warren and co-author Amelia Warren Tyagi documented in their 2003 book, The Two-Income Trap, the increasing number of two-earner families was matched by rising costs in a number of areas such as education, home costs and transportation.

These cost increases, combined with wage stagnation, mean that families are struggling to make ends meet—and that neither parent has the luxury of staying home any longer. In fact, parenthood has become a financial risk. Warren and Tyagi write that “Having a child is now the single best predictor that a woman will end up in financial collapse.” This book was written over a decade ago; things are even worse today.

3. The rich can live debt-free. Others have no choice but to borrow.

Most Americans are falling behind, as their salaries fail to keep up with their expenses. No wonder debt is on the rise. As Joshua Freedman and Sherle R. Schwenninger observe in a paper for the New America Foundation, “American households… have become dependent on debt to maintain their standard of living in the face of stagnant wages.”

This “debt-dependent economy,” as Freedman and Schwenninger call it, has negative implications for the nation as a whole. But individual families are suffering too.

Rani Molla of the Wall Street Journal notes that “Over the past 20 years the average increase in spending on some items has exceeded the growth of incomes. The gap is especially poignant for those under 25 years old.”

There are increasingly two classes of Americans: Those who are taking on additional debt, and the rich.

4. Student debt is crushing a generation of non-wealthy Americans.

Education for every American who wants to get ahead? Forget about it. Nowadays you have to be rich to get a college education; that is, unless you want to begin your career with a mountain of debt. Once you get out of college, you’ll quickly discover that the gap between spending and income is greatest for people under 25 years of age.

Education, as Forbes columnist Steve Odland put it in 2012, is “the great equalizer… the facilitator of the American dream.” But at that point college costs had risen 500 percent since 1985, while the overall consumer price index rose by 115 percent. As of 2013, tuition at a private university was projected to cost nearly $130,000 on average over four years, and that’s not counting food, lodging, books, or other expenses.

Public colleges and universities have long been viewed as the get-ahead option for all Americans, including the poorest among us. Not anymore. The University of California was once considered a national model for free, high-quality public education, but today tuition at UC Berkeley is $12,972 per year. (It was tuition-free until Ronald Reagan became governor.) Room and board is $14,414. The total cost of on-campus attendance at Berkeley, including books and other items, is estimated to be $32,168.

The California story has been repeated across the country, as state cutbacks in the wake of the financial crisis caused the cost of public higher education to soar by 15 percent in a two-year period. With a median national household income of $51,000, even public colleges are quickly becoming unaffordable.

Sure, there are still some scholarships and grants available. But even as college costs rise, the availability of those programs is falling, leaving middle-class and lower-income students further in debt as out-of-pocket costs rise.

5. Vacations aren’t for the likes of you anymore.

Think you’d like to have a nice vacation? Think again. According to a 2012 American Express survey, Americans who were planning vacations expected to spend an average of $1,180 per person. That’s $4,720 for a family of four. But then, why worry about paying for that vacation? If you’re unemployed, you can’t afford it. And even if you have a job, there’s a good chance you won’t get the time off anyway.

As the Center for Economic and Policy Research found in 2013, the United States is the only advanced economy in the world that does not require employers to offer paid vacations to their workers. The number of paid holidays and vacation days received by the average worker in this country (16) would not meet the statutory minimum requirements in 19 other developed countries, according to the CEPR. Thirty-one percent of workers in smaller businesses had no paid vacation days at all.

The CEPR also found that 14 percent of employees at larger corporations also received no paid vacation days. Overall, roughly one in four working Americans gets no vacation time at all.

Rep. Alan Grayson, who has introduced the Paid Vacation Act, correctly notes that the average working American now spends 176 hours more per year on the job than was the case in 1976.

Between the pressure to work more hours and the cost of vacation, even people who do get vacation time—at least on paper—are hard-pressed to take any time off. That’s why 175 million vacation days go unclaimed each year.

6. Even with health insurance, medical care is increasingly unaffordable for most people.

Medical care when you need it? That’s for the wealthy.

The Affordable Care Act was designed to increase the number of Americans who are covered by health insurance. But health coverage in this country is the worst of any highly developed nation—and that’s for people who have health insurance.

Every year the Milliman actuarial firm analyzes the average costs of medical care, including the household’s share of insurance premiums and out-of-pocket costs, for a family of four with the kind of insurance that is considered higher quality coverage in this country: a PPO plan which allows them to use a wider range of healthcare providers.

Even as overall wealth in this country has shifted upward, away from middle-class families, the cost of medical care is increasingly being borne by the families themselves. As the Milliman study shows, the employer-funded portion of healthcare costs has risen 52 percent since 2007, the first year of the recession. But household costs have risen by a staggering 73 percent, or 8 percent per year, and now average $9,144. In the same time period, Census Bureau figures show that median household income has fallen 8 percent.
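A minimal sketch of the compounding behind those percentages appears below, assuming the 73 percent rise runs from 2007 through 2014 (the time frame implied by the article); the 2007 dollar figure is simply backed out of the $9,144 average quoted above rather than taken from the Milliman study itself.

# Quick check of the compounding behind the Milliman figures cited above.
# Assumption: the 73 percent cumulative rise covers 2007 through 2014.
YEARS = 7                 # 2007 through 2014
CUMULATIVE_RISE = 0.73    # 73 percent increase in household healthcare costs, as cited

annualized = (1 + CUMULATIVE_RISE) ** (1 / YEARS) - 1
print(f"Annualized growth rate: {annualized:.1%}")   # ~8.1%, i.e. the "8 percent per year" quoted

# Backing an implied 2007 household cost out of the $9,144 average cited above.
implied_2007_cost = 9144 / (1 + CUMULATIVE_RISE)
print(f"Implied 2007 household cost: ${implied_2007_cost:,.0f}")   # ~$5,286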

That means that household healthcare costs are skyrocketing even as income falls dramatically.

The recent claims of “lowered healthcare costs” are misleading. While the rate of increase is slowing down, healthcare costs are continuing to increase. And the actual cost to working Americans is increasing even faster, as corporations continue to maximize their record profits by shifting healthcare costs onto consumers. This shift is expected to accelerate as the result of a misguided provision in the Affordable Care Act which will tax higher-cost plans.

According to an OECD survey, the number of Americans who report going without needed healthcare in the past year because of cost was higher than in 10 comparable countries. This was true for both lower-income and higher-income Americans, suggesting that insured Americans are also feeling the pinch when it comes to getting medical treatment.

As inequality worsens, wages continue to stagnate, and more healthcare costs are placed on the backs of working families, more and more Americans will find medical care unaffordable.

7. Americans can no longer look forward to a secure retirement.

Want to retire when you get older, as earlier generations did, and enjoy a secure life after a lifetime of hard work? You’ll get to… if you’re rich.

There was a time when most middle-class Americans could work until they were 65 and then look forward to a financially secure retirement. Corporate pensions guaranteed a minimum income for the remainder of their life. Those pensions, coupled with Social Security income and a lifetime’s savings, assured that these ordinary Americans could spend their senior years in modest comfort.

No longer. As we have already seen, rising expenses mean most Americans are buried in debt rather than able to accumulate modest savings. That’s the main reason why 20 percent of Americans who are nearing retirement age haven’t saved for their post-working years.

Meanwhile, corporations are gutting these pension plans in favor of far less generous programs. The financial crisis of 2008, driven by the greed of Wall Street one-percenters, robbed most American households of their primary assets. And right-wing “centrists” of both parties, not satisfied with the rising retirement age that has already cut the program’s benefits, continue to press for even deeper cuts to the program.

One group, Natixis Global Asset Management, ranks the United States 19th among developed countries when it comes to retirement security. The principal reasons the US ranks so poorly are 1) the weakness of our pension programs; and 2) the stinginess of our healthcare system, which, even with Medicare for the elderly, is far weaker than that of nations such as Austria.

Economists used to speak of retirement security as a three-legged stool. Pensions were one leg of the stool, savings were another and Social Security was the third. Today two legs of the stool have been shattered, and anti-Social Security advocates are sawing away at the third.

Conclusion

Vacations; an education; staying home to raise your kids; a life without crushing debt; seeing the doctor when you don’t feel well; a chance to retire: one by one, these mainstays of middle-class life are disappearing for most Americans. Until we demand political leadership that will do something about it, they’re not coming back.

Can the American dream be restored? Yes, but it will take concerted effort to address two underlying problems. First, we must end the domination of our electoral process by wealthy and powerful elites. At the same time, we must begin to address the problem of growing economic inequality. Without a national movement to call for change, change simply isn’t going to happen.

Richard (RJ) Eskow is a writer and policy analyst. He is a Senior Fellow with the Campaign for America’s Future and is host and managing editor of The Zero Hour on We Act Radio.

 

http://www.salon.com/2014/10/25/7_facts_that_show_the_american_dream_is_dead_partner/?source=newsletter

The Impulse Society

How Our Growing Desperation for Instant Connection Is Ruining Us

Consumer culture does everything in its power to persuade us that adversity has no place in our lives.

The following is an excerpt from Paul Roberts’ new book, The Impulse Society: America in the Age of Instant Gratification (Bloomsbury, 2014). Reprinted here with permission.

The metaphor of the expanding fragile modern self is quite apt. To personalize is, in effect, to reject the world “as is,” and instead to insist on bending it to our preferences, as if mastery and dominance were our only mode. But humans aren’t meant only for mastery. We’re also meant to adapt to something larger. Our large brains are specialized for cooperation and compromise and negotiation—with other individuals, but also with the broader world, which, for most of history, did not cater to our preferences or likes. For all our ancestors’ tremendous skills at modifying and improving their environment, daily survival depended as much on their capacity to conform themselves and their expectations to the world as they found it. Indeed, it was only by enduring adversity and disappointment that we humans gained the strength and knowledge and perspective that are essential to sustainable mastery.

Virtually every traditional culture understood this and regarded adversity as inseparable from, and essential to, the formation of strong, self-sufficient individuals. Yet the modern conception of “character” now leaves little space for discomfort or real adversity. To the contrary, under the Impulse Society, consumer culture does everything in its considerable power to persuade us that adversity and difficulty and even awkwardness have no place in our lives (or belong only in discrete, self-enhancing moments, such as ropes courses or really hard ab workouts). Discomfort, difficulty, anxiety, suffering, depression, rejection, uncertainty, or ambiguity—in the Impulse Society, these aren’t opportunities to mature and toughen or become. Rather, they represent errors and inefficiencies, and thus opportunities to correct—nearly always with more consumption and self-expression.

So rather than having to wait a few days for a package, we have it overnighted. Or we pay for same-day service. Or we pine for the moment when Amazon launches drone delivery and can get us our package in thirty minutes.* And as the system gets faster at gratifying our desires, the possibility that we might actually be more satisfied by waiting and enduring a delay never arises. Just as nature abhors a vacuum, the efficient consumer market abhors delay and adversity, and by extension, it cannot abide the strength of character that delay and adversity and inefficiency generally might produce. To the efficient market, “character” and “virtue” are themselves inefficiencies—impediments to the volume-based, share-price-maximizing economy. Once some new increment of self-expressive, self-gratifying, self-promoting capability is made available, the unstated but overriding assumption of contemporary consumer culture is that this capability can and should be put to use. Which means we now allow the efficient market and the treadmills and the relentless cycles of capital and innovation to determine how, and how far, we will take our self-expression and, by extension, our selves— even when doing so leaves us in a weaker state.

Consider the way our social relationships, and the larger processes of community, are changing under the relentless pressure of our new efficiencies. We know how important community is for individual development. It’s in the context of community that we absorb the social rules and prerequisites for interaction and success. It’s here that we come to understand and, ideally, to internalize, the need for limits and self-control, for patience and persistence and long-term commitments; the pressure of community is one way society persuades us to control our myopia and selfishness. (Or as economists Sam Bowles and Herbert Gintis have put it, community is the vehicle through which “society’s ‘oughts’ become its members’ ‘wants.’ ”) But community’s function isn’t simply to say “no.” It’s in the context of our social relationships where we discover our capacities and strengths. It’s here that we gain our sense of worth as individuals, as citizens and as social producers—active participants who don’t merely consume social goods, but contribute something the community needs.

But community doesn’t simply teach us to be productive citizens. People with strong social connections generally have a much better time. We enjoy better physical and mental health, recover faster from sickness or injury, and are less likely to suffer eating or sleeping disorders. We report being happier and rank our quality of life as higher—and do so even when the community that we’re connected to isn’t particularly well off or educated. Indeed, social connectedness is actually more important than affluence: regular social activities such as volunteering, church attendance, entertaining friends, or joining a club provide us with the same boost to happiness as does a doubling of personal income. As Harvard’s Robert Putnam notes, “The single most common finding from a half century’s research on the correlates of life satisfaction, not only in the United States but around the world, is that happiness is best predicted by the breadth and depth of one’s social connections.”

Unfortunately, for all the importance of social connectedness, we haven’t done a terribly good job of preserving it under the Impulse Society. Under the steady pressure of commercial and technological efficiencies, many of the tight social structures of the past have been eliminated or replaced with entirely new social arrangements. True, many of these new arrangements are clearly superior—even in ostensibly free societies, traditional communities left little room for individual growth or experimentation or happiness. Yet our new arrangements, which invariably seek to give individuals substantially more control over how they connect, exact a price. More and more, social connection becomes just another form of consumption, one we expect to tailor to our personal preferences and schedules—almost as if community was no longer a necessity or an obligation, but a matter of personal style, something to engage as it suits our mood or preference. And while such freedom has its obvious attractions, it clearly has downsides. In gaining so much control over the process of social connection, we may be depriving ourselves of some of the robust give-and-take of traditional interaction that is essential to becoming a functional, fulfilled individual.

Consider our vaunted and increasing capacity to communicate and connect digitally. In theory, our smartphones and social media allow us the opportunity to be more social than at any time in history. And yet, because there are few natural limits to this format—we can, in effect, communicate incessantly, posting every conceivable life event, expressing every thought no matter how incompletely formed or inappropriate or mundane—we may be diluting the value of the connection.

Studies suggest, for example, that the efficiency with which we can respond to an online provocation routinely leads to escalations that can destroy offline relationships. “People seem aware that these kinds of crucial conversations should not take place on social media,” notes Joseph Grenny, whose firm, VitalSmarts, surveys online behavior. “Yet there seems to be a compulsion to resolve emotions right now and via the convenience of these channels.”

Even when our online communications are entirely friendly, the ease with which we can reach out often undermines the very connection we seek to create. Sherry Turkle, a sociologist and clinical psychologist who has spent decades researching digital interactions, argues that because it is now possible to be in virtually constant contact with others, we tend to communicate so excessively that even a momentary lapse can leave us feeling isolated or abandoned. Where people in the pre-digital age did not think it alarming to go hours or days or even weeks without hearing from someone, the digital mind can become uncomfortable and anxious without instant feedback. In her book Alone Together, Turkle describes a social world of collapsing time horizons. College students text their parents daily, and even hourly, over the smallest matters—and feel anxious if they can’t get a quick response. Lovers break up over the failure to reply instantly to a text; friendships sour when posts aren’t “liked” fast enough. Parents call 911 if Junior doesn’t respond immediately to a text or a phone call—a degree of panic that was simply unknown before constant digital contact. Here, too, is a world made increasingly insecure by its own capabilities and its own accelerating efficiencies.

This same efficiency-driven insecurity now lurks just below the surface in nearly all digital interactions. Whatever the relationship (romantic, familial, professional), the very nature of our technology inclines us to a constant state of emotional suspense. Thanks to the casual, abbreviated nature of digital communication, we converse in fragments of thoughts and feelings that can be completed only through more interaction—we are always waiting to know how the story ends. The result, Turkle says, is a communication style, and a relationship style, that allow us to “express emotions while they are being formed” and in which “feelings are not fully experienced until they are communicated.” In other words, what was once primarily an interior process—thoughts were formed and feelings experienced before we expressed them—has now become a process that is external and iterative and public. Identity itself comes to depend on iterative interaction—giving rise to what Turkle calls the “collaborative self.” Meanwhile, our skills as a private, self-contained person vanish. “What is not being cultivated here,” Turkle writes, “is the ability to be alone and reflect on one’s emotions in private.” For all the emphasis on independence and individual freedom under the Impulse Society, we may be losing the capacity to truly be on our own.

In a culture obsessed with individual self-interest, such an incapacity is surely one of the greatest ironies of the Impulse Society. Yet in many ways it was inevitable. Herded along by a consumer culture that is both solicitous and manipulative, one that proposes absolute individual liberty while enforcing absolute material dependence—we rely completely on the machine of the marketplace—it is all too easy to emerge with a self-image, and a sense of self, that are both wildly inflated and fundamentally weak and insecure. Unable to fully experience the satisfactions of genuine independence and individuality, we compensate with more personalized self-expression and gratification, which only push us further from the real relationships that might have helped us to a stable, fulfilling existence.

 

http://www.alternet.org/books/impulse-society-how-our-growing-desperation-instant-connection-ruining-us?akid=12390.265072.bjTHr8&rd=1&src=newsletter1024073&t=9&paging=off&current_page=1#bookmark

 

Census report: Half of Americans poor or near poor


By Andre Damon
22 October 2014

Forty-seven percent of Americans have incomes under twice the official poverty rate, making half of the country either poor or near-poor, according to figures released last week by the Census Bureau.

These figures are based on the Census Bureau’s Supplemental Poverty Measure (SPM), which takes into account government transfers and the regional cost-of-living in calculating the poverty rate. According to that calculation, there were 48.7 million people in poverty in the United States, three million higher than the official census figures released last month. The US poverty rate, according to the SPM, was 15.5 percent.

Data from the Census Bureau report

The release of these figures, as well as last month’s official poverty figures, has been greeted with silence in the media, despite the fact that the US is a mere two weeks away from a midterm election. As with every major social and political question, the issues of poverty and social inequality are being totally excluded from debate and discussion in the elections and ignored by the two big business parties.

The figures follow the release of a series of reports and studies documenting the growth of social inequality in the United States. Last week, Credit Suisse reported that the top one percent of the world’s population controls nearly half of all wealth, and that the United States has nearly ten times more super-wealthy people than any other country.

The census figures “show that poverty is still a major problem in the US,” said Christopher Wimer, Co-Director of the Center on Poverty and Social Policy at Columbia University, in a telephone interview Tuesday.

He said the SPM begins with a slightly higher poverty threshold than the official poverty figure, and then adjusts it based on the local cost of living and the prices of necessities of life.

As a result, both the poverty rate and the number of people in poverty are slightly higher than under the official poverty figure. But the biggest difference is that the more sophisticated supplemental measure shows the extent to which a much broader section of the population is struggling to make ends meet. “Because the supplemental poverty measure subtracts non-discretionary expenses from income, you get a lot more people hovering close to the poverty line,” Dr. Wimer said.

The official poverty threshold is calculated as “three times the cost of a minimum food diet in 1963,” adjusted for inflation. By that calculation, the poverty threshold for an adult living alone is $11,888, and for an adult with two children it is $18,769, both of which are absurdly low.
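To make the structure of these two calculations concrete, here is a minimal sketch in Python. The numeric inputs are hypothetical placeholders chosen for illustration, not the Census Bureau’s actual food-budget, inflation, or cost-of-living figures.

# Sketch of the two poverty measures discussed above. The official threshold is
# three times a 1963 minimum food budget, adjusted for inflation; the SPM-style
# test compares resources net of non-discretionary expenses against a threshold
# adjusted for the local cost of living. All numbers are hypothetical placeholders.

def official_threshold(food_budget_1963, inflation_factor):
    """Three times the 1963 minimum food diet cost, scaled for inflation."""
    return 3 * food_budget_1963 * inflation_factor

def spm_style_is_poor(income, transfers, nondiscretionary_expenses,
                      base_threshold, cost_of_living_factor):
    """Cash income plus government transfers, minus necessary expenses,
    compared against a threshold adjusted for the local cost of living."""
    resources = income + transfers - nondiscretionary_expenses
    return resources < base_threshold * cost_of_living_factor

print(official_threshold(food_budget_1963=525, inflation_factor=7.6))  # 11970.0
print(spm_style_is_poor(income=14_000, transfers=1_500,
                        nondiscretionary_expenses=4_000,
                        base_threshold=12_500, cost_of_living_factor=1.2))  # True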

Since the SPM takes into account regional differences in the cost of living, it better reflects the true prevalence of poverty in high cost-of-living states, such as California, and cities, such as New York City.

“The supplemental poverty measure reflects the fact that the cost of living is much higher in many major metropolitan areas,” Dr. Wimer said. “Those areas also tend to have higher population densities, so that ends up affecting a lot of people.”

Based on the latest Census numbers, nearly one in four people in California lives below the poverty line. Using the supplemental measure, California has a poverty rate of 23.4 percent, compared with the state’s official poverty figures of about 16 percent.

Dr. Wimer said he and fellow researchers at Columbia University have followed a methodology similar to that used by the Census Bureau’s Supplemental Poverty Measure to study economic hardship among a representative sample of New York City residents.

They found that nearly a quarter of the city’s residents were in poverty—23 percent, compared to the official poverty rate of about 21 percent. Fifty-five percent of New York City residents had incomes below twice the poverty line.

Thirty-seven percent of New York City residents were affected by what the survey called “severe material hardships,” including “staying at a shelter, moving in with others or having utilities shut off.” The report added, “If we consider the number of New Yorkers who suffer moderate, if not truly severe, material adversity, the number climbs to 6 in 10 New Yorkers.”

Based on these findings, the report concluded, “Nearly two-thirds of New York City residents struggled to make ends meet at some point during 2012.”

The wealth of the super-rich, meanwhile, continues to soar, with the net worth of the Forbes 400 richest people in the United States surging 13 percent last year. Fifty-two members of the Forbes 400 resided in New York City, more than twice the number living in any other city.

Dr. Wimer noted that the Census SPM report shows the role played by government anti-poverty programs in keeping large sections of the population out of destitution. To the extent that there has been a decline in poverty in recent decades, “it is not driven by market income; the reduction has been coming from government policies and programs such as food stamps and unemployment insurance,” he said.

According to the Census SPM report, food stamps kept two percent of the population out of poverty in 2012, while unemployment insurance kept about one percent of the population out of poverty. The census figures reflect cutbacks in both of these programs in 2013.

With the expiration of federal extended unemployment benefits for the long-term unemployed at the end of last year, together with additional cutbacks to food stamps, the number of people affected by cuts to these vital anti-poverty programs will only increase.

Cuts to these programs have been implemented and supported by both the Democrats and Republicans. The Obama administration’s 2015 budget proposal, for example, calls for slashing the budget of the Department of Health and Human Services, which funds the Head Start preschool program, and the Department of Agriculture, which administers the food stamp program, by more than five percent.

 

http://www.wsws.org/en/articles/2014/10/22/pove-o22.html

Who Gives More of Their Money to Charity?

People Who Make More or Less Than $200k a Year?

Philanthropy and income have an inverse relationship.

Billionaire CEO Nicholas Woodman, news reports trumpeted earlier this month, has set aside $450 million worth of his GoPro stock to set up a brand-new charitable foundation.

“We wake up every morning grateful for the opportunities life has given us,” Woodman and his wife Jill noted in a joint statement. “We hope to return the favor as best we can.”

Stories about charitable billionaires have long been a media staple. The defenders of our economic order love them — and regularly trot them out to justify America’s ever more top-heavy concentration of income and wealth.

Our charities depend, the argument goes, on the generosity of the rich. The richer the rich, the better off our charitable enterprises will be.

But this defense of inequality, analysts have understood for quite some time, holds precious little water. Low- and middle-income people, the research shows, give a greater share of their incomes to charity than people of decidedly more ample means.

The Chronicle of Philanthropy, the nation’s top monitor of everything charitable, last week dramatically added to this research.

Between 2006 and 2012, a new Chronicle analysis of IRS tax return data reveals, Americans who make over $200,000 a year decreased the share of their income they devote to charity by 4.6 percent.

Over those same years, a time of recession and limited recovery, these same affluent Americans saw their own incomes increase. For the nation’s top 5 percent of income earners, that increase averaged 9.9 percent.

By contrast, those Americans making less than $100,000 actually increased their giving between 2006 and 2012. The most generous Americans of all? Those making less than $25,000. Amid the hard times of recent years, low-income Americans devoted 16.6 percent more of their meager incomes to charity.

Overall, those making under $100,000 increased their giving by 4.5 percent.

In the half-dozen years this new study covers, the Chronicle of Philanthropy concludes, “poor and middle class Americans dug deeper into their wallets to give to charity, even though they were earning less.”

America’s affluent do still remain, in absolute terms, the nation’s largest givers to charity. In 2012, the Chronicle analysis shows, those earning under $100,000 handed charities $57.3 billion. Americans making over $200,000 gave away $77.5 billion.

But that $77.5 billion pales next to how much more the rich could — rather painlessly — be giving. Between 2006 and 2012, the combined wealth of the Forbes 400 alone increased by $1.04 trillion.

What the rich do give to charity often does people truly in need no good at all. Wealthy people do the bulk of their giving to colleges and cultural institutions, notes Chronicle of Philanthropy editor Stacy Palmer. Food banks and other social service charities “depend more on lower income Americans.”

Low- and middle-income people, adds Palmer, “know people who lost their jobs or are homeless.” They’ve been sacrificing “to help their neighbors.”

America’s increasing economic segregation, meanwhile, has left America’s rich less and less exposed to “neighbors” struggling to get by. That’s opening up, says Vox policy analyst Danielle Kurtzleben, an “empathy gap.”

“After all,” she explains, “if I can’t see you, I’m less likely to help you.”

The more wealth concentrates, the more nonprofits chase after these less-than-empathetic rich for donations. The priorities of these rich, notes Kurtzleben, become the priorities for more and more nonprofits.

The end result? Elite universities get mega-million-dollar donations to build mahogany-appointed student dorms. Art museums get new wings. Hospitals get windfalls to tackle the diseases that spook the high-end set.

Some in that set do seem to sense the growing disconnect between real need and real resources. Last week billionaire hedge fund manager David Einhorn announced a $50 million gift to help Cornell University set students up in “real-world experiences” that address the challenges hard-pressed communities face.

“When you go out beyond the classroom and into the community and find problems and have to deal with people in the real world,” says Einhorn, “you develop skills for empathy.”

True enough — but in a society growing ever more unequal and separate, not enough. In that society — our society — the privileged will continue to go “blind to how people outside their own class are living,” as Danielle Kurtzleben puts it.

We need, in short, much more than Empathy 101. We need more equality.

Labor journalist Sam Pizzigati, an Institute for Policy Studies associate fellow, writes widely about inequality. His latest book is “The Rich Don’t Always Win: The Forgotten Triumph over Plutocracy that Created the American Middle Class, 1900-1970.”

 

http://www.alternet.org/economy/guess-who-gives-more-their-money-charity-people-who-make-more-or-less-200k-year?akid=12386.265072.PjWDq0&rd=1&src=newsletter1023920&t=15&paging=off&current_page=1#bookmark

Obama Is a Republican

He’s the heir to Richard Nixon, not Saul Alinsky.

illustration by Michael Hogue


Back in 2008, Boston University professor Andrew Bacevich wrote an article for this magazine making a conservative case for Barack Obama. While much of it was based on disgust with the warmongering and budgetary profligacy of the Republican Party under George W. Bush, which he expected to continue under 2008 Republican nominee Sen. John McCain, Bacevich thought Obama at least represented hope for ending the Iraq War and shrinking the national-security state.

I wrote a piece for the New Republic soon afterward about the Obamacon phenomenon—prominent conservatives and Republicans who were openly supporting Obama. Many saw in him a classic conservative temperament: someone who avoided lofty rhetoric, an ambitious agenda, and a Utopian vision that would conflict with human nature, real-world barriers to radical reform, and the American system of government.

Among the Obamacons were Ken Duberstein, Ronald Reagan’s chief of staff; Charles Fried, Reagan’s solicitor general; Ken Adelman, director of the Arms Control and Disarmament Agency for Reagan; Jeffrey Hart, longtime senior editor of National Review; Colin Powell, Reagan’s national security adviser and secretary of state for George W. Bush; and Scott McClellan, Bush’s press secretary. There were many others as well.

According to exit polls in 2008, Obama ended up with 20 percent of the conservative vote. Even in 2012, after four years of relentless conservative attacks, he still got 17 percent of the conservative vote, with 11 percent of Tea Party supporters saying they cast their ballots for Obama.

They were not wrong. In my opinion, Obama has governed as a moderate conservative—essentially as what used to be called a liberal Republican before all such people disappeared from the GOP. He has been conservative to exactly the same degree that Richard Nixon basically governed as a moderate liberal, something no conservative would deny today. (Ultra-leftist Noam Chomsky recently called Nixon “the last liberal president.”)

Here’s the proof:

Iraq/Afghanistan/ISIS

One of Obama’s first decisions after the election was to keep national-security policy essentially on automatic pilot from the Bush administration. He signaled this by announcing on November 25, 2008, that he planned to keep Robert M. Gates on as secretary of defense. Arguably, Gates had more to do with determining Republican policy on foreign and defense policy between the two Bush presidents than any other individual, serving successively as deputy national security adviser in the White House, director of Central Intelligence, and secretary of defense.

Another early indication of Obama’s hawkishness was naming his rival for the Democratic nomination, Sen. Hillary Clinton, as secretary of state. During the campaign, Clinton ran well to his right on foreign policy, so much so that she earned the grudging endorsement of prominent neoconservatives such as Bill Kristol and David Brooks.

Obama, Kristol told the Washington Post in August 2007, “is becoming the antiwar candidate, and Hillary Clinton is becoming the responsible Democrat who could become commander in chief in a post-9/11 world.” Writing in the New York Times on February 5, 2008, Brooks praised Clinton for hanging tough on Iraq “through the dark days of 2005.”

Right-wing columnist Ann Coulter found Clinton more acceptable on national-security policy than even the eventual Republican nominee, Senator McCain. Clinton, Coulter told Fox’s Sean Hannity on January 31, 2008, was “more conservative than he [McCain] is. I think she would be stronger in the war on terrorism.” Coulter even said she would campaign for Clinton over McCain in a general election match up.

After Obama named Clinton secretary of state, there was “a deep sigh” of relief among Republicans throughout Washington, according to reporting by The Daily Beast’s John Batchelor. He noted that not a single Republican voiced any public criticism of her appointment.

By 2011, Republicans were so enamored with Clinton’s support for their policies that Dick Cheney even suggested publicly that she run against Obama in 2012. The irony is that as secretary of state, Clinton was generally well to Obama’s left, according to Vali Nasr’s book The Dispensable Nation. This may simply reflect her assumption of the State Department’s historical role as the dovish voice in every administration. Or it could mean that Obama is far more hawkish than conservatives have given him credit for.

Although Obama followed through on George W. Bush’s commitment to pull U.S. troops out of Iraq in 2011, in 2014 he announced a new campaign against ISIS, an Islamic militant group based in Syria and Iraq.

Stimulus/Deficit

With the economy collapsing, the first major issue confronting Obama in 2009 was some sort of economic stimulus. Christina Romer, chair of the Council of Economic Advisers, whose academic work at the University of California, Berkeley, frequently focused on the Great Depression, estimated that the stimulus needed to be in the range of $1.8 trillion, according to Noam Scheiber’s book The Escape Artists.

The American Recovery and Reinvestment Act was enacted in February 2009 with a gross cost of $816 billion. Although this legislation was passed without a single Republican vote, it is foolish to assume that the election of McCain would have resulted in savings of $816 billion. There is no doubt that he would have put forward a stimulus plan of roughly the same order of magnitude, but tilted more toward Republican priorities.

A Republican stimulus would undoubtedly have had more tax cuts and less spending, even though every serious study has shown that tax cuts are the least effective method of economic stimulus in a recession. Even so, tax cuts made up 35 percent of the budgetary cost of the stimulus bill—$291 billion—despite an estimate from Obama’s Council of Economic Advisers that tax cuts barely raised the gross domestic product $1 for every $1 of tax cut. By contrast, government purchases raised GDP by $1.55 for every $1 spent. Obama also extended the Bush tax cuts for two years in 2010.
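As a rough illustration of the multiplier arithmetic in the paragraph above, the sketch below plugs the article’s own figures into the cited multipliers; it is a back-of-the-envelope comparison, not the Council of Economic Advisers’ model.

# Back-of-the-envelope sketch using the figures in the text: an $816 billion
# package with $291 billion in tax cuts, multipliers of roughly 1.0 for tax
# cuts and 1.55 for government purchases. All figures in billions of dollars.
TOTAL = 816
TAX_CUTS = 291
PURCHASES = TOTAL - TAX_CUTS   # 525

TAX_MULTIPLIER = 1.0
PURCHASE_MULTIPLIER = 1.55

actual_mix = TAX_CUTS * TAX_MULTIPLIER + PURCHASES * PURCHASE_MULTIPLIER
all_purchases = TOTAL * PURCHASE_MULTIPLIER

print(f"Estimated GDP boost, actual mix:    ${actual_mix:,.0f} billion")    # ~$1,105 billion
print(f"Estimated GDP boost, all purchases: ${all_purchases:,.0f} billion") # ~$1,265 billion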

It’s worth remembering as well that Bush did not exactly bequeath Obama a good fiscal hand. Fiscal year 2009 began on October 1, 2008, and one third of it was baked in the cake the day Obama took the oath of office. On January 7, 2009, the Congressional Budget Office projected significant deficits without considering any Obama initiatives. It estimated a deficit of $1.186 trillion for 2009 with no change in policy. The Office of Management and Budget estimated in November of that year that Bush-era policies, such as Medicare Part D, were responsible for more than half of projected deficits over the next decade.

Republicans give no credit to Obama for the significant deficit reduction that has occurred on his watch—just as they ignore the fact that Bush inherited a projected budget surplus of $5.6 trillion over the following decade, which he turned into an actual deficit of $6.1 trillion, according to a CBO study—but the improvement is real.


Republicans would have us believe that their tight-fisted approach to spending is what brought down the deficit. But in fact, Obama has been very conservative, fiscally, since day one, to the consternation of his own party. According to reporting by the Washington Post and New York Times, Obama actually endorsed much deeper cuts in spending and the deficit than did the Republicans during the 2011 budget negotiations, but Republicans walked away.

Obama’s economic conservatism extends to monetary policy as well. His Federal Reserve appointments have all been moderate to conservative, well within the economic mainstream. He even reappointed Republican Ben Bernanke as chairman in 2009. Many liberals have faulted Obama for not appointing board members willing to be more aggressive in using monetary policy to stimulate the economy and reduce unemployment.

Obama’s other economic appointments, such as Larry Summers at the National Economic Council and Tim Geithner at Treasury, were also moderate to conservative. Summers served on the Council of Economic Advisers staff in Reagan’s White House. Geithner joined the Treasury during the Reagan administration and served throughout the George H.W. Bush administration.

Health Reform

Contrary to rants that Obama’s 2010 health reform, the Patient Protection and Affordable Care Act (ACA), is the most socialistic legislation in American history, the reality is that it is virtually textbook Republican health policy, with a pedigree from the Heritage Foundation and Massachusetts Gov. Mitt Romney, among others.

It’s important to remember that historically the left-Democratic approach to healthcare reform was always based on a fully government-run system such as Medicare or Medicaid. During debate on health reform in 2009, this approach was called “single payer,” with the government being the single payer. One benefit of this approach is cost control: the government could use its monopsony buying power to force down prices just as Walmart does with its suppliers.

Conservatives wanted to avoid too much government control and were adamantly opposed to single-payer. But they recognized that certain problems required more than a pure free-market solution. One problem in particular is covering people with pre-existing conditions, one of the most popular provisions in ACA. The difficulty is that people may wait until they get sick before buying insurance and then expect full coverage for their conditions. Obviously, this free-rider problem would bankrupt the health-insurance system unless there was a fix.

The conservative solution was the individual mandate—forcing people to buy private health insurance, with subsidies for the poor. This approach was first put forward by Heritage Foundation economist Stuart Butler in a 1989 paper, “A Framework for Reform,” published in a Heritage Foundation book, A National Health System for America. In it, Butler said the number one element of a conservative health system was this: “Every resident of the U.S. must, by law, be enrolled in an adequate health care plan to cover major health costs.” He went on to say:

Under this arrangement, all households would be required to protect themselves from major medical costs by purchasing health insurance or enrolling in a prepaid health plan. The degree of financial protection can be debated, but the principle of mandatory family protection is central to a universal health care system in America.

In 1991, prominent conservative health economist Mark V. Pauly also endorsed the individual mandate as central to healthcare reform. In an article in the journal Health Affairs, Pauly said:

All citizens should be required to obtain a basic level of health insurance. Not having health insurance imposes a risk of delaying medical care; it also may impose costs on others, because we as a society provide care to the uninsured. … Permitting individuals to remain uninsured results in inefficient use of medical care, inequity in the incidence of costs of uncompensated care, and tax-related distortions.

In 2004, Senate Majority Leader Bill Frist (R-Tenn.) endorsed an individual mandate in a speech to the National Press Club. “I believe higher-income Americans today do have a societal and personal responsibility to cover in some way themselves and their children,” he said. Even libertarian Ron Bailey, writing in Reason, conceded the necessity of a mandate in a November 2004 article titled, “Mandatory Health Insurance Now!” Said Bailey: “Why shouldn’t we require people who now get health care at the expense of the rest of us pay for their coverage themselves? … Mandatory health insurance would not be unlike the laws that require drivers to purchase auto insurance or pay into state-run risk pools.”

Among those enamored with the emerging conservative health reform based on an individual mandate was Mitt Romney, who was elected governor of Massachusetts in 2002. In 2004, he put forward a state health reform plan to which he later added an individual mandate. As Romney explained in June 2005, “No more ‘free riding,’ if you will, where an individual says: ‘I’m not going to pay, even though I can afford it. I’m not going to get insurance, even though I can afford it. I’m instead going to just show up and make the taxpayers pay for me’.”

The following month, Romney emphasized his point: “We can’t have as a nation 40 million people—or, in my state, half a million—saying, ‘I don’t have insurance, and if I get sick, I want someone else to pay’.”

In 2006, Governor Romney signed the Massachusetts health reform into law, including the individual mandate. Defending his legislation in a Wall Street Journal article, he said:

I proposed that everyone must either purchase a product of their choice or demonstrate that they can pay for their own health care. It’s a personal responsibility principle.

Some of my libertarian friends balk at what looks like an individual mandate. But remember, someone has to pay for the health care that must, by law, be provided: Either the individual pays or the taxpayers pay. A free ride on government is not libertarian.

As late as 2008, Robert Moffitt of the Heritage Foundation was still defending the individual mandate as reasonable, non-ideological and nonpartisan in an article for the Harvard Health Policy Review.

So what changed just a year later, when Obama put forward a health-reform plan that was almost a carbon copy of those previously endorsed by the Heritage Foundation, Mitt Romney, and other Republicans? The only thing that changed is that the plan was now supported by a Democratic president whom Republicans had vowed to fight on every single issue, according to Robert Draper’s book Do Not Ask What Good We Do.

Senior Obama adviser David Axelrod later admitted that Romney’s Massachusetts plan was the “template” for Obama’s plan. “That work inspired our own health plan,” he said in 2011. But no one in the White House said so back in 2009. I once asked a senior Obama aide why. His answer was that once Republicans refused to negotiate on health reform and Obama had to win only with Democratic votes, it would have been counterproductive, politically, to point out the Obama plan’s Republican roots.

The left wing of the House Democratic caucus was dubious enough about Obama’s plan as it was, preferring a single-payer plan. Thus it was necessary for Obama to portray his plan as more liberal than it really was to get the Democratic votes needed for passage, which of course played right into the Republicans’ hands. But the reality is that ACA remains a very modest reform based on Republican and conservative ideas.

Other Rightward Policies 

Below are a few other issues on which Obama has consistently tilted rightward:

Drugs: Although it has become blindingly obvious that throwing people in jail for marijuana use is insane policy and a number of states have moved to decriminalize its use, Obama continued the harsh anti-drug policy of previous administrations, and his Department of Justice continues to treat marijuana as a dangerous drug. As Time put it in 2012: “The Obama Administration is cracking down on medical marijuana dispensaries and growers just as harshly as the Administration of George W. Bush did.”

National-security leaks: At least since Nixon, a hallmark of Republican administrations has been an obsession with leaks of unauthorized information, and pushing the envelope on government snooping. By all accounts, Obama’s penchant for secrecy and withholding information from the press is on a par with the worst Republican offenders. Journalist Dan Froomkin charges that Obama has essentially institutionalized George W. Bush’s policies. Nixon operative Roger Stone thinks Obama has actually gone beyond what his old boss tried to do.

Race: I think almost everyone, including me, thought the election of our first black president would lead to new efforts to improve the dismal economic condition of African-Americans. In fact, Obama has seldom touched on the issue of race, and when he has he has emphasized the conservative themes of responsibility and self-help. Even when Republicans have suppressed minority voting, in a grotesque campaign to fight nonexistent voter fraud, Obama has said and done nothing.

Gay marriage: Simply stating public support for gay marriage would seem to have been a no-brainer for Obama, but it took him two long years to speak out on the subject and only after being pressured to do so.

Corporate profits: Despite Republican harping about Obama being anti-business, corporate profits and the stock market have risen to record levels during his administration. Even those progressives who defend Obama against critics on the left concede that he has bent over backward to protect corporate profits. As Theda Skocpol and Lawrence Jacobs put it: “In practice, [Obama] helped Wall Street avert financial catastrophe and furthered measures to support businesses and cater to mainstream public opinion. …  He has always done so through specific policies that protect and further opportunities for businesses to make profits.”

I think Cornel West nailed it when he recently charged that Obama has never been a real progressive in the first place. “He posed as a progressive and turned out to be counterfeit,” West said. “We ended up with a Wall Street presidency, a drone presidency, a national security presidency.”

I don’t expect any conservatives to recognize the truth of Obama’s fundamental conservatism for at least a couple of decades—perhaps only after a real progressive presidency. In any case, today they are too invested in painting him as the devil incarnate in order to frighten grassroots Republicans into voting to keep Obama from confiscating all their guns, throwing them into FEMA re-education camps, and other nonsense that is believed by many Republicans. But just as they eventually came to appreciate Bill Clinton’s core conservatism, Republicans will someday see that Obama was no less conservative.

Bruce Bartlett is the author of The Benefit and the Burden: Tax Reform—Why We Need It and What It Will Take.

http://www.theamericanconservative.com/articles/obama-is-a-republican/

A “silent majority” of young people without college degrees and decent jobs are on a downwardly mobile slide.


A Majority of Millennials Don’t Have a College Degree—That’s Going to Cost Everybody


There’s a lot of hoopla in the media about how Millennials are the best-educated generation in history, blah, blah, blah. But according to a Pew survey, that’s a distortion of reality. In fact, two-thirds of Millennials between ages 25 and 32 don’t have a bachelor’s degree. The education gap among this generation is higher than for any other in history in terms of how those with a college degree will fare compared to those without. Reflecting a trend that has been gaining momentum in the rest of America, Millennials are rapidly getting sorted into winners and losers. Most of them are losing. That’s going to cost this generation a lot—and the rest of society, too.

According to Pew, young college graduates are ahead of their less-educated peers on just about every measure of economic well-being and how they are faring in the course of their careers. Their parents’ and grandparents’ generations did not take as big a hit from not going to college, but for Millennials, the blow is severe. Without serious intervention, its effects will be permanent.

Young college grads working full-time are earning an eye-popping $17,500 more per year than those with only a high school diploma. To put this in perspective, in 1979 when the first Baby Boomers were the same age that Millennials are today, a high school graduate could earn around three-quarters (77 percent) of what his or her college-educated peer took in. But Millennials with only a high school diploma earn only 62 percent of what the college grads earn.

According to Pew, young people with a college degree are also more likely to have full-time jobs, much more likely to have a job of any kind, and more likely to believe that their job will lead to a fulfilling career. But forty-two percent of those with a high school diploma or less see their work as “just a job to get by.” In stark contrast, only 14 percent of college grads have such a negative assessment of their jobs.

Granted, college is expensive. But nine out of 10 Millennials say it’s worth it — even those who have had to borrow to foot the bill. They seem to have absorbed the fact that in a precarious economy, a college diploma is the bare minimum for security and stability.

Why are those with less education doing so badly? The Great Recession is part of the answer. There has also been a trend in which jobs, when they return after a financial crisis, are worse than those that were lost. After the recession of the 80s, for example, unionized labor never again found jobs as good as the ones they’d had before the downturn. The same thing has happened this time, only even more dramatically. The jobs that are returning are often part-time, underpaid, lacking in benefits and short on opportunities to advance. It’s great to embark on a career as an engineer at Apple, not so great to work in an Apple retail store, where pay is low and the hope for a career is minimal. The Great Recession amplified a trend of McJobs that had been gaining strength for decades, stoked by the decline in unions, deregulation, outsourcing, and poor corporate governance that have tilted the balance of power away from employees to such a degree that many young people now expect exploitation and poor conditions on the job simply as a matter of course, with no experience of how things could be any different.

All this is not to say that having a college degree gives you a free pass: This generation of college-educated adults is doing slightly worse on certain measures, like the percentage without jobs, than Gen Xers, Baby Boomers or members of the silent generation when they were in their mid-20s and early 30s. But today’s young people who don’t go to college are doing much worse than those in similar situations in the generations that came before.

Poverty is one of the biggest threats to Millennials without college degrees. Nearly a quarter (22 percent) of young people ages 25 to 32 without a college degree live in poverty today, whereas only 6 percent of the college-educated fall into this camp. When Baby Boomers were the same age as today’s Millennials, only 7 percent of those with only a high school diploma were living in poverty.

It’s true that more Millennials than past generations have college degrees, and it’s also true that the value of those diplomas has increased. Given those facts, you might think that the Millennial generation should be earning more than earlier generations of young adults. You would be wrong—and that’s because it’s more costly not to have a college education than ever before. So the education have-nots are pulling the average of the whole generation down. The typical high school graduate’s earnings dropped by more than $3,000, from $31,384 in 1965 to $28,000 in 2013.

There are also more Millennials who don’t even have a high school diploma than previous generations: Some have taken to calling Millennials “Generation Dropout.” A 2013 article in the Atlantic Monthly noted that compared to other countries, the newest wave of employees is actually less educated than their parents because of the lower number completing high school. A recent program on NPR called the 25- to 32-year-old cohort without college degrees and decent jobs the “Silent Majority.”

In 1965, young college graduates earned $7,499 more than those with a high school diploma. But the earnings gap by educational attainment has steadily widened since then, and today it has more than doubled to $17,500 among Millennials ages 25 to 32.

All of this is alarming because it means that less-educated workers are going to have a really hard time. Compared to the Silent Generation, those with high school or less are three times more likely to be jobless.

When you look at the length of time the typical job seeker spends looking for work, less educated Millennials are again faring poorly. In 2013 the average unemployed college-educated Millennial had been looking for work for 27 weeks—more than double the time it took an unemployed college-educated 25- to 32-year-old in 1979 to find a job (12 weeks). And again, today’s young high school graduates do worse on this measure compared to the college-educated or their peers in earlier generations. Millennial high school graduates spend, on average, four weeks longer looking for work than college graduates (31 weeks vs. 27 weeks).

These young people are ending up in dire straits — stuck in debt, unable to set up their own households, and having to put off starting families (recent research shows that many women who face economic hard times in their 20s will never end up having kids). It’s not that they don’t want to grow up, it’s that they don’t have access to the things that make independence possible, like a good education, a good job, a strong social safety net, affordable childcare, and so on.

How much is this going to cost America as a nation? It’s too early to say for sure, but Millennial underemployment, which is directly linked to undereducation, is already costing $25 billion a year, largely because of the lost tax revenue. But what about the other costs? The increased rates of alcoholism and substance abuse? The broken relationships? The depression? The long list of physical ailments that go along with the stress of not being able to gain and keep a financial foothold?

Once upon a time, more forward-thinking politicians and politicos recognized that young people who have the bad luck to try to launch into adulthood in the wake of an economic crisis not of their own making need real help. They need jobs programs, training and decent work conditions that could improve not only their individual lives but the health of the whole society and economy. We have the blueprint of how to do this from the New Deal. It’s going to cost everyone if America leaves these young people to suffer this cruel fate.

Lynn Parramore is an AlterNet senior editor. She is cofounder of Recessionwire, founding editor of New Deal 2.0, and author of “Reading the Sphinx: Ancient Egypt in Nineteenth-Century Literary Culture.” She received her Ph.D. in English and cultural theory from NYU. She is the director of AlterNet’s New Economic Dialogue Project. Follow her on Twitter @LynnParramore.

http://www.alternet.org/education/surprise-majority-millennials-dont-have-college-degree-thats-going-cost-everybody?akid=12378.265072.6qEBLL&rd=1&src=newsletter1023736&t=7&paging=off&current_page=1#bookmark
