New report details depth of hunger crisis in the United States

By Shannon Jones
19 August 2014

A new study by the non-profit agency Feeding America reveals the extent and depth of the hunger crisis in the United States.

According to the report, “Hunger in America 2014,” about one in seven people in the US, 46 million in all, rely on food banks in order to feed themselves and their families. That number includes 12 million children and 7 million senior citizens. The findings are based on responses from a survey of 60,000 recipients of food aid from the 200 food banks affiliated with Feeding America.

The survey was conducted between April and August 2013 and thus does not reflect the impact of more recent cuts carried out by the Obama administration to the Supplemental Nutrition Assistance Program (SNAP), commonly referred to as food stamps. In February Obama signed legislation slashing $8.7 billion over ten years from the federal food stamp program. It followed cuts in November that slashed food assistance by $319 per year for a typical family of three.

Last year the US government reported that a record number of Americans, more than 47 million, were relying on food stamp benefits. According to Feeding America, only 55 percent of those receiving aid through its affiliated food banks are also enrolled in the SNAP program.

The persistence of high levels of food insecurity, more than five years after the economic crash of 2008-2009, demonstrates that talk of an economic recovery is a fraud. In fact hardship is growing among wide layers of the population, including those who are working and attending school.

The numbers presented by the survey indicate that the problem of hunger in America is broader than food stamp enrollment alone suggests, and could encompass as much as 20 percent of the US population.

The Feeding America survey revealed a number of startling facts. According to the report 69 percent of respondents said they have to choose between food and paying utility bills, and 66 percent said they have to choose between food and medical care. Another 31 percent said they had to choose between food and education.

Among those who are having difficulty getting enough food to eat, one of the most common coping strategies is to purchase less healthy, cheaper foods in order to stretch dollars. Such foods, which often contain high levels of sugar and sodium, can contribute to a multitude of health problems including obesity, heart disease and diabetes.

The survey noted the high incidence of individuals with health problems among those seeking food assistance. Nearly 47 percent of those responding said they had fair or poor health. It found that 58 percent of households had a member with high blood pressure and 38 percent had a member with diabetes. Some 29 percent of households reported that no member had access to health insurance, including Medicare and Medicaid.

According to the survey 39 percent of respondent households include at least one child, a higher rate than the general population (32 percent). Six percent reported both children and seniors living in the same household.

Significantly, the study found that 54 percent of households had at least one member who was employed in the past year. The rate is even higher for households with children, at 71 percent. In addition, many households reported members with education beyond high school, including some with two- and four-year college degrees. About 21 percent reported attending or graduating from college.

Some 43 percent of Feeding America clients are white. Twenty-six percent are African American and 20 percent Latino. Nearly 15 percent of client households are multi-racial.

Requests for food assistance are up even for those serving in the US military, the employer of last resort for many working-class youth. The study found that almost 620,000 households enrolled in Feeding America programs had at least one family member currently in the US military. That figure amounts to 25 percent of all US military households.

Another striking statistic contained in the report relates to the large number of college students seeking food assistance. According to the report, 10 percent of adult aid recipients are students, including two million attending school full-time and one million part-time.

Burdened with crushing loads of student debt and unable to find decent-paying work, many college students are turning to food pantries. In fact, more and more colleges and universities have opened food pantries in response to the growing problem of food insecurity among students.

Hunger inhibits learning in many ways, reducing concentration and impairing the ability to retain material. Some students even have to forego purchasing course textbooks in order to pay for food.

According to researchers at Oregon State University, 59 percent of the students at Western Oregon University, a liberal arts college located in Monmouth, Oregon, are food insecure. A 2011 survey at the City University of New York found that 39.2 percent of the system’s 250,000 undergraduates had experienced food insecurity at some time in the past year.

Between 2007 and 2011 the number of food stamp recipients holding doctoral degrees tripled, according to a 2012 report in the Chronicle of Higher Education. At Michigan State University, the on-campus food pantry reports that more than 50 percent of its clients are graduate students.

The Carnage of Capitalism

Capitalism is expanding like a tumor in the body of American society, spreading further into vital areas of human need like health and education.

Milton Friedman said in 1980: “The free market system distributes the fruits of economic progress among all people.” The father of the modern neoliberal movement couldn’t have been more wrong. Inequality has been growing for 35 years, worsening since the 2008 recession, as a few well-positioned Americans have made millions while the rest of us have gained almost nothing. Now, our college students and medicine-dependent seniors have become the source of new riches for the profit-seeking free-marketers.

Higher Education: Administrators Get Most of the Money

College grads took a 19 percent pay cut in the two years after the recession. By 2013, over half of employed recent black college graduates were working in occupations that typically do not require a four-year college degree. For those still in school, tuition has risen much faster than any other living expense, and the average student loan balance has risen 91 percent over the past ten years.

At the other extreme is the winner-take-all free-market version of education, with a steady flow of compensation towards the top. Remarkably, and not coincidentally, as inequality has surged since the 1980s, the number of administrators at private universities has doubled. Administrators now outnumber faculty on every campus across the country.

These administrators are taking the big money. As detailed by Lawrence Wittner, the 25 highest-paid presidents increased their salaries by a third between 2009 and 2012, to nearly a million dollars each. For every million-dollar public university president in 2011, there were fourteen such presidents at private universities, and dozens of lower-level administrators aspiring to be paid like their bosses. At Purdue, for example, the 2012 administrative ranks included a $313,000-a-year acting provost, a $198,000 chief diversity officer, a $253,000 marketing officer and a $433,000 business school chief.

All this money at the top has to come from somewhere, and that means from faculty and students. Adjunct and student teachers, who made up about 22 percent of instructional staff in 1969, now make up an estimated 76 percent of instructional staff in higher education, with a median wage in 2010 of about $2,700 per course. More administrative money comes from tuition, which has increased by over 1,000 percent since 1978.

At the for-profit colleges, according to a Senate report on 2009 expenses, education companies spent about 23 percent of all revenue on marketing and advertising, and almost 20 percent of revenue on pre-tax profits for their shareholders. They spent just 17.2 percent of their revenue on instruction.

Medicine: A 10,000 Percent Profit for Corporations

As with education, the extremes forced upon us by free-market health care are nearly beyond belief. First, at the human end, 43 percent of sick Americans skipped doctor’s visits and/or medication purchases in 2011 because of excessive costs. It’s estimated that over 40,000 Americans die every year because they can’t afford health insurance.

At the corporate end, drugmakers are at times getting up to $100 for every $1 spent. That’s true at Gilead Sciences, the manufacturer of the drug Sovaldi, which charges about $10 a pill to its customers in Egypt, then comes home to charge $1,000 a pill to its American customers (100 times the Egyptian price). The 10,000 percent profit also holds for the increasingly lucrative, government-funded Human Genome Project, which is estimated to potentially return about $140 for every $1 spent. Big business is quickly making its move: Celera Genomics, Abbott Labs, Merck, Roche, Bristol-Myers Squibb, and Pfizer are all starting to cash in.

The extremes of capitalist greed are evident in the corporate lobbying of Congress to keep Medicare from negotiating better drug prices for the American consumer. Americans are cheated further when corporations pay off generic drug manufacturers to delay entry of their products into the market, thereby ensuring inflated profits for the big firms for the duration of their shady deals.

Global Greed

Lives are being ravaged by unregulated, free-market capitalism, in the U.S. and around the world. According to the Global Forum for Health Research, less than 10 percent of the global health research budget is spent on the conditions responsible for 90 percent of human disease.

And the greed is getting worse. Perhaps it’s our irrational fear of socialism, peaking in the years after World War II, that has inspired our winner-take-all culture. In the Reagan era we listened to Margaret Thatcher proclaim that “There is no such thing as society.”

In a more socially-conscious time, in 1955, after Dr. Jonas Salk had developed the polio vaccine, he was asked by reporter Edward R. Murrow: “Who owns the patent on this vaccine?” Responded Salk, “Well, the people, I would say. There is no patent. Could you patent the sun?”

A free-market capitalist might remind us that a skillful hedge fund manager can make as much as a thousand Jonas Salks.


Paul Buchheit teaches economic inequality at DePaul University. He is the founder and developer of the Web sites UsAgainstGreed.org, PayUpNow.org and RappingHistory.org, and the editor and main author of “American Wars: Illusions and Realities” (Clarity Press). He can be reached at paul@UsAgainstGreed.org.

http://www.alternet.org/economy/carnage-capitalism?akid=12138.265072.pC6w-o&rd=1&src=newsletter1015885&t=13&paging=off&current_page=1#bookmark

“War and Peace” tortured me: Facebook, email and the neuroscience of always being distracted

I used to be able to read for hours without digital interruption. Now? That’s just funny. I want my focus back!

This essay is adapted from “The End of Absence”

I’m enough of a distraction addict that a low-level ambient guilt about not getting my real work done hovers around me for most of the day. And this distractible quality in me pervades every part of my life. The distractions—What am I making for dinner?, Who was that woman in “Fargo”?, or, quite commonly, What else should I be reading?—are invariably things that can wait. What, I wonder, would I be capable of doing if I weren’t constantly worrying about what I ought to be doing?

And who is this frumpy thirty-something man who has tried to read “War and Peace” five times, never making it past the garden gate? I took the tome down from the shelf this morning and frowned again at those sad little dog-ears near the fifty-page mark.

Are the luxuries of time on which deep reading is reliant available to us anymore? Even the attention we deign to give to our distractions, those frissons, is narrowing.

It’s important to note this slippage. As a child, I would read for hours in bed without the possibility of a single digital interruption. Even the phone (which was anchored by wires to the kitchen wall downstairs) was generally mute after dinner. Our two hours of permitted television would come to an end, and I would seek out the solitary refuge of a novel. And deep reading (as opposed to reading a Tumblr feed) was a true refuge. What I liked best about that absorbing act was the fact that books became a world unto themselves, one that I (an otherwise powerless kid) had some control over. There was a childish pleasure in holding the mysterious object in my hands; in preparing for the story’s finale by monitoring what Austen called a “tell-tale compression of the pages”; in proceeding through some perfect sequence of plot points that bested by far the awkward happenstance of real life.

The physical book, held, knowable, became a small mental apartment I could have dominion over, something that was alive because of my attention and then lived in me.

But now . . . that thankful retreat, where my child-self could become so lost, seems unavailable to me. Today there is no room in my house, no block in my city, where I am unreachable.

Eventually, if we start giving them a chance, moments of absence reappear, and we can pick them up if we like. One appeared this morning, when my partner flew to Paris. He’ll be gone for two weeks. I’ll miss him, but this is also my big break.



I’ve taken “War and Peace” back down off the shelf. It’s sitting beside my computer as I write these lines—accusatory as some attention-starved pet.

You and me, old friend. You, me, and two weeks. I open the book, I shut the book, and I open the book again. The ink swirls up at me. This is hard. Why is this so hard?

* * *

Dr. Douglas Gentile, a friendly professor at Iowa State University, recently commiserated with me about my pathetic attention span. “It’s me, too, of course,” he said. “When I try to write a paper, I can’t keep from checking my e-mail every five minutes. Even though I know it’s actually making me less productive.” This failing is especially worrying for Gentile because he happens to be one of the world’s leading authorities on the effects of media on the brains of the young. “I know, I know! I know all the research on multitasking. I can tell you absolutely that everyone who thinks they’re good at multitasking is wrong. We know that in fact it’s those who think they’re good at multitasking who are the least productive when they multitask.”

The brain itself is not, whatever we may like to believe, a multitasking device. And that is where our problem begins. Your brain does a certain amount of parallel processing in order to synthesize auditory and visual information into a single understanding of the world around you, but the brain’s attention is itself only a spotlight, capable of shining on one thing at a time. So the very word multitask is a misnomer. There is rapid-shifting minitasking, there is lame-spasms-of-effort-tasking, but there is, alas, no such thing as multitasking. “When we think we’re multitasking,” says Gentile, “we’re actually multiswitching.”

We can hardly blame ourselves for being enraptured by the promise of multitasking, though. Computers—like televisions before them—tap into a very basic brain function called an “orienting response.” Orienting responses served us well in the wilderness of our species’ early years. When the light changes in your peripheral vision, you must look at it because that could be the shadow of something that’s about to eat you. If a twig snaps behind you, ditto. Having evolved in an environment rife with danger and uncertainty, we are hardwired to always default to fast-paced shifts in focus. Orienting responses are the brain’s ever-armed alarm system and cannot be ignored.

Gentile believes it’s time for a renaissance in our understanding of mental health. To begin with, just as we can’t accept our body’s cravings for chocolate cake at face value, neither can we any longer afford to indulge the automatic desires our brains harbor for distraction.

* * *

It’s not merely difficult at first. It’s torture. I slump into the book, reread sentences, entire paragraphs. I get through two pages and then stop to check my e-mail—and down the rabbit hole I go. After all, one does not read “War and Peace” so much as suffer through it. It doesn’t help that the world at large, being so divorced from such pursuits, is often aggressive toward those who drop away into single-subject attention wells. People don’t like it when you read “War and Peace.” It’s too long, too boring, not worth the effort. And you’re elitist for trying.

In order to finish the thing in the two weeks I have allotted myself, I must read one hundred pages each day without fail. If something distracts me from my day’s reading—a friend in the hospital, a magazine assignment, sunshine—I must read two hundred pages on the following day. I’ve read at this pace before, in my university days, but that was years ago and I’ve been steadily down-training my brain ever since.

* * *

Another week has passed—my “War and Peace” struggle continues. I’ve realized now that the subject of my distraction is far more likely to be something I need to look at than something I need to do. There have always been activities—dishes, gardening, sex, shopping—that derail whatever purpose we’ve assigned to ourselves on a given day. What’s different now is the addition of so much content that we passively consume.

Only this morning I watched a boy break down crying on “X Factor,” then regain his courage and belt out a half-decent rendition of Beyoncé’s “Listen”; next I looked up the original Beyoncé video and played it twice while reading the first few paragraphs of a story about the humanity of child soldiers; then I switched to a Nina Simone playlist prepared for me by Songza, which played while I flipped through a slide show of American soldiers seeing their dogs for the first time in years; and so on, ad nauseam. Until I shook myself out of this funk and tried to remember what I’d sat down to work on in the first place.

* * *

If I’m to break from our culture of distraction, I’m going to need practical advice, not just depressing statistics. To that end, I switch gears and decide to stop talking to scientists for a while; I need to talk to someone who deals with attention and productivity in the so-called real world. Someone with a big smile and tailored suits, such as organizational guru Peter Bregman. He runs a global consulting firm that gets CEOs to unleash the potential of their workers, and he’s also the author of the acclaimed business book “18 Minutes,” which counsels readers to take a minute out of every work hour (plus five minutes at the start and end of the day) to do nothing but set an intention.

Bregman told me he sets his watch to beep every hour as a reminder that it’s time to right his course again. Aside from the intention setting, Bregman counsels no more than three e-mail check-ins a day. This notion of batch processing was anathema to someone like me, used to checking my in-box so constantly, particularly when my work feels stuck. “It’s incredibly inefficient to switch back and forth,” said Bregman, echoing every scientist I’d spoken to on multitasking. “Besides, e-mail is, actually, just about the least efficient mode of conversation you can have. And what we know about multitasking is that, frankly, you can’t. You just derail.”

“I just always feel I’m missing something important,” I said. “And that’s precisely why we lose hours every day, that fear,” Bregman replied. He argues that it’s people who can get ahead of that fear who end up excelling in the business world that he spends his own days in. “I think everyone is more distractible today than we used to be. It’s a very hard thing to fix. And as people become more distracted, we know they’re actually doing less, getting less done. Your efforts just leak out. And those who aren’t—aren’t leaking—are going to be the most successful.”

I hate that I leak. But there’s a religious certainty required in order to devote yourself to one thing while cutting off the rest of the world. We don’t know that the inbox is emergency-free, we don’t know that the work we’re doing is the work we ought to be doing. But we can’t move forward in a sane way without having some faith in the moment we’ve committed to. “You need to decide that things don’t matter as much as you might think they matter,” Bregman suggested as I told him about my flitting ways. And that made me think there might be a connection between the responsibility-free days of my youth and that earlier self’s ability to concentrate. My young self had nowhere else to be, no permanent anxiety nagging at his conscience. Could I return to that sense of ease? Could I simply be where I was and not seek out a shifting plurality to fill up my time?

* * *

It happened softly and without my really noticing.

As I wore a deeper groove into the cushions of my sofa, so the book I was holding wore a groove into my (equally soft) mind. Moments of total absence began to take hold more often; I remembered what it was like to be lost entirely in a well-spun narrative. There was the scene where Anna Mikhailovna begs so pitifully for a little money, hoping to send her son to war properly dressed. And there were, increasingly, more like it. More moments where the world around me dropped away and I was properly absorbed. A “causeless springtime feeling of joy” overtakes Prince Andrei; a tearful Pierre sees in a comet his last shimmering hope; Emperor Napoleon takes his troops into the heart of Russia, oblivious to the coming winter that will destroy them all…

It takes a week or so for withdrawal symptoms to work through a heroin addict’s body. While I wouldn’t pretend to compare severity here, doubtless we need patience, too, when we deprive ourselves of the manic digital distractions we’ve grown addicted to.

That’s how it was with my Tolstoy and me. The periods without distraction grew longer, I settled into the sofa and couldn’t hear the phone, couldn’t hear the ghost-buzz of something else to do. I’m teaching myself to slip away from the world again.

* * *

Yesterday I fell asleep on the sofa with a few dozen pages of “War and Peace” to go. I could hear my cell phone buzzing from its perch on top of the piano. I saw the glowing green eye of my Cyclops modem as it broadcast potential distraction all around. But on I went past the turgid military campaigns and past the fretting of Russian princesses, until sleep finally claimed me and my head, exhausted, dreamed of nothing at all. This morning I finished the thing at last. The clean edges of its thirteen hundred pages have been ruffled down into a paper cabbage, the cover is pilled from the time I dropped it in the bath. Holding the thing aloft, trophy style, I notice the book is slightly larger than it was before I read it.

It’s only after the book is laid down, and I’ve quietly showered and shaved, that I realize I haven’t checked my e-mail today. The thought of that duty comes down on me like an anvil.

Instead, I lie back on the sofa and think some more about my favorite reader Milton – about his own anxieties around reading. By the mid-1650s, he had suffered that larger removal from the crowds: he had lost his vision entirely and could not read at all—at least not with his own eyes. From within this new solitude, he worried that he could no longer meet his potential. One sonnet, written shortly after the loss of his vision, begins:

When I consider how my light is spent,

Ere half my days, in this dark world and wide,

And that one Talent which is death to hide

Lodged with me useless . . .

Yet from that position, in the greatest of caves, he began producing his greatest work. The epic “Paradise Lost,” a totemic feat of concentration, was dictated to aides, including his three daughters.

Milton already knew, after all, the great value in removing himself from the rush of the world, so perhaps those anxieties around his blindness never had a hope of dominating his mind. I, on the other hand, and all my peers, must make a constant study of concentration itself. I slot my ragged “War and Peace” back on the shelf. It left its marks on me the same way I left my marks on it (I feel awake as a man dragged across barnacles on the bottom of some ocean). I think: This is where I was most alive, most happy. How did I go from loving that absence to being tortured by it? How can I learn to love that absence again?

This essay is adapted from “The End of Absence” by Michael Harris, published by Current / Penguin Random House.

 

http://www.salon.com/2014/08/17/war_and_peace_tortured_me_facebook_email_and_the_neuroscience_of_always_being_distracted/?source=newsletter

Face Time: Eternal Youth Has Become a Growth Industry in Silicon Valley

Tuesday, Aug 12 2014

The students of Timothy Draper’s University of Heroes shuffle into a conference room, khaki shorts swishing against their knees, flip-flops clacking against the carpeted floor. One by one they take their seats and crack open their laptops, training their eyes on Facebook home pages or psychedelic screen savers. An air conditioner whirs somewhere in the rafters. A man in chinos stands before them.

The man is Steve Westly, former state controller, prominent venture capitalist, 57-year-old baron of Silicon Valley. He smiles at the group with all the sheepishness of a student preparing for show-and-tell. He promises to be brief.

“People your age are changing the world,” Westly tells the students, providing his own list of great historical innovators: Napoleon, Jesus, Zuckerberg, Larry, Sergey. “It’s almost never people my age,” he adds.

Students at Draper University — a private, residential tech boot camp launched by venture capitalist Timothy Draper, in what was formerly San Mateo’s Benjamin Franklin Hotel — have already embraced Westly’s words as a credo. They inhabit a world where success and greatness seem to hover within arm’s reach. A small handful of those who complete the six-week, $9,500 residential program might get a chance to join Draper’s business incubator; an even smaller handful might eventually get desks at an accelerator run by Draper’s son, Adam. It’s a different kind of meritocracy than Westly braved, pursuing an MBA at Stanford in the early ’80s. At Draper University, heroism is merchandised, rather than earned. A 20-year-old with bright eyes and deep pockets (or a parent who can front the tuition) has no reason to think he won’t be the next big thing.

This is the dogma that glues Silicon Valley together. Young employees are plucked out of high school; college-aged interns trade their frat houses and dorm rooms for luxurious corporate housing. Twenty-seven-year-old CEOs inspire their workers with snappy jingles about moving fast and breaking things. Entrepreneurs pitch their business plans in slangy, tech-oriented patois.

Gone are the days of the “company man” who spends 30 years ascending the ranks in a single corporation. Having an Ivy League pedigree and a Brooks Brothers suit is no longer as important.

“Let’s face it: The days of the ‘gold watch’ are over,” 25-year-old writer David Burstein says. “The average millennial is expected to have several jobs by the time he turns 38.”

Yet if constant change is the new normal, then older workers have a much harder time keeping up. The Steve Westlys of the world are fading into management positions. Older engineers are staying on the back-end, working on system administration or architecture, rather than serving as the driving force of a company.

“If you lost your job, it might be hard to find something similar,” a former Google contractor says, noting that an older engineer might have to settle for something with a lower salary, or even switch fields. The contractor says he knows a man who graduated from Western New England University in the 1970s with a degree in the somewhat archaic field of time-motion engineering. That engineer wound up working at Walmart.

Those who do worm their way into the Valley workforce often have a rough adjustment. The former contractor, who is in his 40s, says he was often the oldest person commuting from San Francisco to Mountain View on a Google bus. And he adhered to a different schedule: Wake up at 4:50 a.m., get out the door by 6:20, catch the first coach home at 4:30 p.m. to be home for a family supper. He was one of the few people who didn’t take advantage of the free campus gyms or gourmet cafeteria dinners or on-site showers. He couldn’t hew to a live-at-work lifestyle.

And compared to other middle-aged workers, he had it easy.

In a lawsuit filed in San Francisco Superior Court in July, former Twitter employee Peter H. Taylor claims he was canned because of his age, despite performing his duties in “an exemplary manner.” Taylor, who was 57 at the time of his termination in September of last year, says his supervisor made at least one derogatory remark about his age, and that the company refused to accommodate his disabilities following a bout with kidney stones. He says he was ultimately replaced by several employees in their 20s and 30s. A Twitter spokesman says the lawsuit is without merit and that the company will “vigorously” defend itself.

The case is not without precedent. Computer scientist Brian Reid lobbed a similar complaint against Google in 2004, claiming co-workers called him an “old man” and an “old fuddy-duddy,” and routinely told him he was not a “cultural fit” for the company. Reid was 54 at the time he filed the complaint; he settled for an undisclosed amount of money.

What is surprising, perhaps, is that a 57-year-old man was employed at Twitter at all. “Look, Twitter has no 50-year-old employees,” the former Google contractor says, smirking. “By the time these [Silicon Valley] engineers are in their 40s, they’re old — they have houses, boats, stock options, mistresses. They drive to work in Chevy Volts.”

There’s definitely a swath of Valley nouveau riche who reap millions in their 20s and 30s, and who are able to cash out and retire by age 40. But that’s a minority of the population. The reality, for most people, is that most startups fail, most corporations downsize, and most workforces churn. Switching jobs every two or three years might be the norm, but it’s a lot easier to do when you’re 25 than when you’re 39. At that point, you’re essentially a senior citizen, San Francisco Botox surgeon Seth Matarasso says.

“I have a friend who lived in Chicago and came back to Silicon Valley at age 38,” Matarasso recalls. “And he said, ‘I feel like a grandfather — in Chicago I just feel my age.’”

Retirement isn’t an option for the average middle-aged worker, and even the elites — people like Westly, who were once themselves wunderkinds — find themselves in an awkward position when they hit their 50s, pandering to audiences that may have no sense of what came before. The diehards still work well past their Valley expiration date, but then survival becomes a job unto itself. Sometimes it means taking lower-pay contract work, or answering to a much younger supervisor, or seeking workplace protection in court.

CONTINUED: http://www.sfweekly.com/sanfrancisco/silicon-valley-bottom-age-discrimination/Content?oid=3079530

Robin Williams and the Virtues of Suicide

Laughter in the Dark

by BINOY KAMPMARK

“Now let the reader’s own moral feelings decide as to whether or not suicide is a criminal act.”

– Arthur Schopenhauer, On Suicide, in Studies in Pessimism

The Grim Reaper never runs out of converts. Put another way, death never gets his full due. Comedians do not figure well in this – they are particularly attractive targets in the business of death.  Ironic, then, that clowns are sometimes hired to make the ill in hospitals laugh, to give the impression that the world is not as dark as all that.

The more one is engaged in the business of making others laugh, the more one is taken from. It is a well that never runs dry. There is much to be said for the idea that the jovial one is the creative giver, and the one who laughs in response is a pirate of emotions, a recipient, yes, but a parasitic one. While we will never know the extent of what a comedian like Robin Williams was going through when he took his life, the tendency toward exploitation is all too strong. The comedian is doomed to suffer, and when that life is taken by the joker’s own hand, questions will be asked.

The casualty list for such noble people is high, and it comes with its fair share of ailments – alcoholism, drugs, the softening blow of dejection.  Tony Hancock, one of Britain’s finest, found himself mocking the Australian society he visited near the end of his life with well-targeted viciousness.  There was a sense that he was coming to his end, brooding in the twilight of his years. He died in a Sydney flat in June, 1968. He did leave grief, but he left a stunning record of humour.

For American audiences, John Belushi is cited as an example of one the reaper carried off on the day the comedy died. Ironically, Belushi’s own death in 1982 from a combination of heroin and cocaine in his Chateau Marmont room in Hollywood was something of a spur for Williams to get clean. No slate, however, is ever clean. The prospect of taking one’s own life is always up the sleeve. Nor should it ever be removed. It is the ultimate play.

Grief, when it is total, untapped and uncontrolled, may leave one suspicious. Williams was evidently admired, though throwing words such as love around is not necessarily a wise thing. It is invariably compensation for what has passed – we did not think enough of him when he was alive. It says much that, at the end of his obituary in the Sydney Morning Herald[1], a helpline number is featured just in case people might get funny ideas. “Support is available for those who may be distressed by phoning Lifeline”. (Mensline and Kids Helpline also feature.) The moment one talks about suicide as a reality, the squeamish start taking over. Whatever you do – don’t do it!

The undertone here is one of ingratitude – you were selfish to leave the land of the living, and in so doing, made sure the lights went out for others. Suicide, which has been practiced for as long as humans have been aware of their own condition, is thereby delegitimised. It is somehow wrong to take one’s own life, especially if you are supposedly loved. Society itself is a barbarism that remains decidedly against the suicidal.

While the scolding element has not been all too evident with Williams, there is a suggestion that he should have shown more concern for his brigades and platoons of anonymous admirers.  Behind the message of love is often a chiding note.  Tributes are not necessarily unqualified in their affection. Why didn’t you, bastard, leave us more jokes, or at the very least, live longer?  We do not know you, but you owed us that.

Philosophers have been busy on the subject of suicide for centuries. A main target of their attention has been organised religion. The paradox of most religions, but notably those of the Book, is that suicide is most of all irresponsible and disgraceful to the glory of life. As Arthur Schopenhauer explained, “none but the votaries of monotheistic, that is to say, Jewish religions, look upon suicide as a crime.”[2] The Enlightenment thinkers were scathing in their criticism of this tunnel-visioned view.

That said, Immanuel Kant did keep a salvo in reserve for those who felt that self-inflicted death was an option. Suicide, he argued, was “in no circumstances permissible”. To commit suicide sees the man who does so sink “lower than the beasts”. A curious view, given that human beings are prone to collective suicide on a constant basis (war, economic ruin, and a myriad of other examples).

Again, as Schopenhauer explained, suicide would lead to a cruel appropriation of the dead person’s property, an “ignominious burial”, and a verdict of “insanity”.  This, he pointed out with scathing observation, was most prevalent in England.  How absurd, then, to compare the issue of someone who met a “voluntary death” with that of someone who inflicted a death with voluntary inclination.

While it is true that Williams has received many condolences and tributes, the love simply flows too far, drying even as it flows. Instead, his decision should be respected, however desperate it was when taken.  Suicide is often misunderstood and almost always underrated.  What should matter is the oeuvre, the body of work that says everything about a man who did something all too easy to ignore: make people laugh.  His humour was volcanic, and it bubbled and steamed till death.  Treasure him, if nothing else, for that.

Dr. Binoy Kampmark was a Commonwealth Scholar at Selwyn College, Cambridge.  He lectures at RMIT University, Melbourne.  Email: bkampmark@gmail.com


http://www.counterpunch.org/2014/08/13/robin-williams-and-the-virtues-of-suicide/

Liberal, environmentalist organizations have a race problem

There are two environmentalist movements: One for rich whites, the other for poor minorities


A report released in July has unfortunate news for the environmental movement: Thus far, it has failed pitifully in promoting diverse workplaces. The study culled data from three different kinds of environmental institutions — 191 conservation organizations, 74 government agencies and 28 grant-making foundations — and found that while the institutions have made considerable progress combating gender discrimination (women represent 60 percent of new hires and interns and occupy more than half of all leadership positions in conservation organizations), they still have a way to go when it comes to racial and ethnic diversity. Even though almost 40 percent of Americans are minorities, they make up less than 16 percent of employees at environmental institutions.

The Washington Post’s Darryl Fears has more:

The research was undertaken by Dorceta Taylor, a professor in the School of Natural Resources and Environment at the University of Michigan. Taylor found that of the 3,140 paid staff members at large groups such as the Sierra Club, Natural Resources Defense Council, Nature Conservancy, Environmental Defense Fund, National Wildlife Federation and Earthjustice, 88 percent were white. On the boards that govern the groups, 95 percent of members were white …

Taylor wrote that an “unconscious bias” exists within the liberal and progressive culture of the groups, preserving a racially homogenous workplace. “Recruitment for new staff frequently occurs through word-of-mouth and informal networks,” the study said. “This makes it difficult for ethnic minorities, the working class, or anyone outside of traditional environmental networks to find out about job openings and apply for those jobs.”

Fears also writes of differences in the kinds of environmentalism promoted by various races:

The divide has resulted in two environmental movements. One is white and the other non-white, one rich and the other poor, one devoted largely to advocating on behalf of wilderness areas and the other for “environmental justice” in core urban areas where minorities tend to live.



The idea that poorer minorities don’t get to worry about protecting the magnificent natural world is not new. The entire concept of environmental justice is predicated on the reality that lower-income and minority communities often get saddled with unjust environmental costs. For example, toxic waste dumps and power plants tend to be located in poorer neighborhoods, a burden compounded by worse access to healthcare. As a result, residents of those poorer communities become activists for environmental justice, while richer, white environmentalists can focus on broader issues of climate change and conservation.

Michel Gelobter, executive director of Redefining Progress, wrote that environmental justice has been so valuable because it offered “a home for activists who weren’t comfortable separating their concern over the state of the planet from their concerns about social justice.” Perhaps if more institutions took on a similar attitude, the diversity problem might begin to solve itself.


Joanna Rothkopf is an assistant editor at Salon, focusing on sustainability. Follow @JoannaRothkopf or email jrothkopf@salon.com.

http://www.salon.com/2014/08/11/there_are_two_environmentalist_movements_one_for_rich_whites_the_other_for_poor_minorities/?source=newsletter

5 Ways Positive Thinking Makes You Miserable


Being totally inauthentic doesn’t really do much for the psyche, or the office.


Some people take the pursuit of happiness too far:

One German IT company has come up with the perfect solution to problems in the workplace – it’s made cheerfulness a contractual obligation. What’s more, the CEO has declared that those who don’t measure up to the prescribed level of jollity in the morning should stay at home until they cheer up.

The idea of positive thinking (and therefore banning negativity) is not new, but is affecting us now more than ever – at home and at work. And ironically, its effects are mostly negative. Yes, forced positive thinking makes us less happy.

Positive thinking is a poorly defined concept which, at its most extreme, says that in every situation you can choose your own mood and your own reactions. No matter what happens to you, you can always choose a positive attitude.

“Fake it til you make it,” they say, claiming that faking happiness actually makes you happier. Basically, if you don’t feel happy every moment of your life, it’s just your own damn fault for not trying hard enough.

Now, this idea is not completely unfounded. In many situations you can actually change your mood and outlook through conscious effort. Let’s say you’re stuck in traffic on your way to work. In a situation like that you can probably change your mindset and switch from being annoyed about the delay to a more positive interpretation of the situation. “Great, I have more time to listen to this interesting radio program” or whatever. Nothing wrong with that.

But the most fanatic proponents of positive thinking (especially fans of The Secret and similar pseudo-scientific nonsense) go much further. They claim that you can always change your thinking in any situation, and that external circumstances don’t matter. No matter what situation you’re in, they say, you can simply choose to be happy.

Tell that to someone who’s seriously ill, who’s just been fired or who is suffering from severe depression. Actually, you should never tell them that, because telling someone in a really rough life situation that they should think more positively is incredibly condescending and a terrible way to trivialize their pain.

You could say I positively hate positive thinking.

There is nothing wrong with the milder form of positive thinking, but the extreme version is bad for you in life and at work. Here are five ways positive thinking screws up our workplaces.

1: Faking emotions at work is stressful

In a fascinating TEDx talk, Danish researcher Mette Boll describes an experiment she performed in which grocery store employees were subjected to stressful situations on the job.

She found that the most stressful condition was not just having to deal with rude customers – the most stressful thing was to have to fake being positive as it happened.

She talks about biological authenticity and says that:

If you respond to an encounter with your systemic impulse, then there is no stress to your system…

If, however, you filter your response and you do what you’re trained to do in a situation like that, (the customer is always right), then it’s more stressful than anything else we could measure or detect in any of these scenarios.

So simply put, having to fake emotions you don’t have is stressful. That sort of demolishes the whole “Fake it till you make it” idea.

2: Positive thinking makes things even worse for people who are unhappy at work

According to the extreme version of positive thinking, if you’re unhappy at work, you’ve only got yourself to blame. It may be that your boss is a jerk, your coworkers bully you and the culture is completely toxic – you should just “have a positive attitude” and “make the best of it.”

So not only are you miserable, but now it’s all your fault. That makes things even worse.

3: Negative emotions are a natural part of work

Here’s a story I once shared from my previous career in IT consulting:

I had a big client in France who couldn’t make up their mind. In every single meeting, the customer changes the specs for the system. First they want this, then they want that. First they want it this way, then that way. Meanwhile, I’m quietly going crazy.

Finally, I lose it in a meeting. They introduce change number 283 (by my loose count), once again going back on what they’ve told me previously, and I snap. I actually pound the table with my fist, snap my folder shut and say through clenched teeth “No. This can’t go on. This system will never get off the ground if you keep changing your mind at every meeting. We need to make decisions and stick to them.”

From that moment on, they started taking me seriously and we got a much better working relationship.

In this situation I felt AND showed anger – a negative emotion. I could’ve forced myself to be positive in that situation, but it would have been a betrayal of my work and myself and it would have felt even worse. Not only was I authentic by being angry, that outburst finally got the client to respect me.

The thing to remember is that negative emotions are not called that because they’re wrong, but simply because they’re unpleasant. Sometimes a negative emotion is exactly the right emotion and if you’re always forcing yourself to be positive you’re being both less authentic and less effective.

When your circumstances are bad, there is nothing wrong with being unhappy; it is only natural. In fact, negative emotions tend to drive us to action more than positive ones, so feeling bad about a bad situation can help you do something about it.

4: Positive thinking can contribute to quelling dissent and ignoring problems in the workplace

Ever heard someone say “In this workplace we don’t have problems, only challenges”?

I hate that phrase with a passion, partly because it’s wrong but mostly because it’s so often used to stifle dissent and criticism.

No workplace is perfect. No job is without problems. If we consistently marginalize and criticize people who are unhappy at work by telling them to be positive and never complain, we lose some very valuable voices of reason and realism in the workplace.

5: Trying to force yourself to be positive makes you unhappy

All of this means, ironically, that positive thinking at work makes us unhappy. If we expect to be happy all the time at work we are bound to be disappointed.

So let’s give negativity its central place in the workplace – as a perfectly natural, even helpful, state of mind. And that, ironically, will lead to more happiness at work!

Your take
Have you ever felt pressured to be happy at work when you weren’t? What did that do to you? What constructive role do you see negativity play at work? Write a comment, I’d love to hear your take.

A quick note: One thing that often bugs me is that some people confuse positive thinking with positive psychology. I base a lot of my work on positive psychology, which is the branch of psychology that studies what makes people thrive and feel happy, where traditional psychology focuses mostly on mental illness. The only thing they have in common is the word “positive.”


http://www.alternet.org/economy/5-ways-positive-thinking-makes-you-miserable-work?akid=12113.265072.SLom9a&rd=1&src=newsletter1015019&t=15&paging=off&current_page=1#bookmark

Obama and the Revival of Wall Street

Obama Gets a (Political) Facelift


by ROB URIE


Ahead of the upcoming mid-term elections there is an argument being put forward by Democrat apologists that President Barack Obama’s resuscitation of Wall Street at public expense and his wholly inadequate ACA (Affordable Care Act) are policy successes worthy of political support. The base tactic is to set up false choices and then argue that Mr. Obama chose among the better of the available alternatives. His unconditional resurrection of Wall Street is posed against an inevitable second Great Depression when the choice was between reviving the existing system of suicide finance or reining it in to serve a public purpose. And the problem in need of solving with the American health care system is a lack of access to health care for a substantial portion of the population. Increasing the intrusion of the insurance industry into the U.S. health care system, as the ACA does, builds an even higher wall between people and access to health care.

The base frame in support of the revival of Wall Street is that Mr. Obama had a choice of Great Depression 2.0 or the wholesale revival of Wall Street, however ‘distasteful’ the latter might have been. Left unsaid is that it is this very same Wall Street that destroyed the economies of the West by creating the housing boom / bust, that the only thing resolved since the onset of the crisis is the incomes and wealth of the very richest, that the Federal government held all of the cards in 2008 – 2009 when Wall Street was at risk of collapse, and that Mr. Obama put the very same Timothy Geithner and Larry Summers who promoted the bank interests that created the crisis in charge of covering it up. Nationalization of major Wall Street banks was put forward as a policy option in top-level discussions of resolving the crisis. The argument now being made that government agencies lacked the legal authority to do so ignores the practical circumstance that Wall Street was dependent on government largesse to avoid wholesale bankruptcy in 2008 – 2009, and that the ‘lacked legal authority’ argument was nowhere to be found in contemporaneous policy discussions.

[Graph 1]

Graph (1) above: both income and wealth dispersion have worsened under Mr. Obama’s tenure as President. Illustrated are the rise in relative (to the median) wealth of the very richest and the decline in relative wealth of the poorest. While the trend toward income and wealth inequality preceded Mr. Obama’s time in office, his policies have overwhelmingly benefited the very rich and left everyone else to their own devices. Rather than simply giving bailout funds to Wall Street, the money could have been used to buy down negative equity for those duped into taking predatory loans to buy houses. Likewise, the purchase by the Federal Reserve of $4 trillion in assets has overwhelmingly benefited the very richest by raising financial asset prices. Source: Russell Sage Foundation.

To be clear, the Obama administration’s choice of what to do with Wall Street wasn’t, was not, between consequence-free bailouts and another Great Depression. In fact the IMF (International Monetary Fund) spent the previous half-century asserting this very point when it came to closing dysfunctional banks overseas. Predatory banks draw economic resources away from well-functioning parts of national economies under nearly all theories of banking and finance. Claims that Obama administration policies (Dodd-Frank) have ‘reformed’ the financial system ignore global financial interconnectedness, multiple commitments of limited collateral, and the fact that not even government regulators believe too-big-to-fail has been resolved. A relevant question: where will the economic resources come from to resolve the $1 – $2 quadrillion in swaps notional value and other derivatives contracts that take precedence over all other claims, including depositors’, in bankruptcy? And the contention that government subsidies of Wall Street have now ended demonstrates near-complete ignorance of how financial markets function.

[Graph 2]

Graph (2) above: one of the more spurious contentions Democrat loyalists put forward is that current ‘calm’ in the form of rising stock markets and house prices, amidst ongoing depression for a substantial proportion of the population, indicates that Obama administration policies resolved anything. The current circumstance of high stock prices (relative to cyclically adjusted corporate earnings) and tight (low) credit spreads has been a sign of impending turmoil in the past. For statistical reasons (non-stationary credit spreads), the way to view the too-big-to-fail subsidy of Wall Street is over boom / bust cycles and not at a cyclical low point for credit spreads. The claim that current conditions answer the question would look quite different if the ‘current condition’ were 2008 – 2009. And in fact, actual economic conditions for most Americans have continued to deteriorate as stock and housing market prices driven higher by Federal Reserve policies have revived the fortunes of the very richest. Source: St. Louis Fed, Moody’s.

[Graph 3]

Graph (3) above: Mr. Obama’s supporters claim that were it not for his consequence-free bailouts of Wall Street a second Great Depression would have occurred. What is illustrated above are the declines in real (inflation-adjusted) median household incomes and wealth both before and after Mr. Obama took office. For the bottom half of the population, both income and wealth continued their declines after Mr. Obama took office. This isn’t to claim that the full cause of the declines lies with Mr. Obama’s policies— former Democrat President Bill Clinton is substantially responsible for the catastrophic deregulation that led to financial calamity. That Mr. Obama appointed the chief architects of this calamity, Timothy Geithner and Larry Summers, to head the bailout efforts for his administration goes far in explaining why only bankers and the very rich have benefited from his policies. Sources: St. Louis Fed and Russell Sage Foundation.

The issues around the ACA (Affordable Care Act) are just as pernicious but in many ways of more direct relevance to most people. A recent article in the New York Times made the point that there is a large ‘learning curve’ to effectively navigate Obamacare coverage. The article focused on Philadelphia but the issues raised are relevant nationwide. For the uninitiated, Pennsylvania is one of twenty-four states that declined to expand Medicaid coverage under Obamacare. Remember: Obamacare was sold as the ‘politically feasible’ alternative to (national) single-payer health care. With Medicaid being the only health care program available to the poor and very poor, the refusal of half of the states to expand the program suggests that settling for the ‘politically feasible’ choice was a very poor political calculation if expanding access to health care was the goal. The practical impact is that a substantial portion of the poor and near-poor across the country will see no increased access to health care under Obamacare.

The ‘learning curve’ at issue is the ins and outs of Obamacare insurance coverage— premiums, subsidies, co-payments, deductibles and out-of-pocket expenses, along with the intricacies of health care networks where health care providers may deal with hundreds of different insurers and know very little of what is covered by any specific insurance plan. In an effort to minimize costs, insurers are creating ‘narrow networks’ that limit who provides covered health care— e.g. specific hospitals, doctors and labs. For example: if (non-emergency) surgery is needed, the person having the surgery must make sure that the hospital where the surgery is to take place is ‘in network,’ that all doctors involved in any aspect of the surgery are in network, that all diagnostic tests are done through ‘in network’ labs and that all drugs prescribed are ‘approved’ within the network. Failure to know any and all of these details, and to make sure that everyone involved— hospital employees, doctors, nurses and administrators— understands and acts on policy limitations, will result in bills for medical services that the people Obamacare was nominally designed to serve can’t afford to pay. Additionally, if you become sick and lose your income, you either go on Medicaid, if you live in one of the twenty-six states that expanded Medicaid coverage, or you are on your own— no matter how many years you have been paying insurance premiums.

Here Wall Street and Obamacare start to come together. The complexification of health care, forcing people to know and to competently navigate every aspect of insurance contracts, medical consultation and health care provision or suffer adverse consequences, is related to the Wall Street strategy of issuing mortgages that only those with a Ph.D. in math and a lot of time to waste on contractual minutiae could understand. The variable-rate mortgages of the housing boom / bust were sold as ‘affordability products,’ as an accommodation to borrowers for their (the borrowers’) benefit. What they actually were was age-old predatory lending: the most complicated mortgages were issued to the least sophisticated borrowers. In the case of Obamacare, complexification works in the interests of insurers. The more difficult it is for the insured to know what costs they are ‘responsible’ for, the easier it is for insurers to force the costs of health care onto them. And even if one assumes honest motives, forcing people to devote their lives to the minutiae of health insurance policies is a uniquely American form of torture.

The delusional premise of Obamacare is that making health care less costly will make it affordable. This has been the Republican fantasy behind health insurance ‘vouchers’ for the last three decades: give everyone a three-hundred-dollar tax break to buy insurance and everyone will have health care. Obamacare is set up to give the poorest bottom half of the country a choice between buying food, paying the rent and ‘buying’ health care. This ‘better than nothing’ approach dissuades people from getting health care until they have no other choice. Decades of experience from actual health care systems suggest that health care, keeping people healthy, is socially and economically less costly than treating people once they are sick. The second-order fantasy at work is that individuals can control health care costs by selecting health insurers that in turn select competent, low-cost health care providers. The amount of information needed to make an informed choice between policies that might actually accomplish this is beyond the ability of everyone likely to be touched by Obamacare. (Quick: what is the probability that an in-network anesthesiologist will be available on any given day? Congratulations, you are one ten-thousandth of the way to making a decision.)

Democrat shill and mainstream economist Paul Krugman argues that California, where Medicaid was expanded under Obamacare, points the way toward a single-payer health care system. The first problem with this is that Medicaid is a program of minimal health care provision for the very poor; any effort to conflate Medicaid with the functioning health care systems of other ‘developed’ countries is as pathetic as it is disingenuous. The second problem is that, for all of the theorized political feasibility of Obamacare, half of the states have refused its most important element, Medicaid expansion, meaning that unless these state governments quickly change their minds Mr. Obama’s ‘pragmatic’ compromise looks a lot like what it is widely perceived to be: a cynical sell-out to the sick-care industry. Speculation that there will be a more propitious time to implement real health care reform (single payer) than when Mr. Obama first took office and Democrats held both houses of Congress derives from the same failed ‘pragmatics’ that now leaves half of the states without Medicaid expansion. The third problem is that unless the complexity is resolved, people are going to despise Obamacare once they realize that they must devote their lives to insurance company minutiae to get the health care they are now being forced to pay health insurance premiums for. If Mr. Krugman, or any other Democrat Party shill, really wants to sell the idea that ‘Medicaid for all’ is the way forward, let them say so clearly so that we know what the Democrat plan really is.

The evidence for the success of Obamacare put forward to date, that a government mandate to make uninsured people buy health insurance has led uninsured people to buy health insurance, illustrates how empty public policy in the U.S. really is. The only relevant metric is whether or not we have a functioning health care system that serves all people regardless of their ability to pay, or a very clear path in that direction. By this measure, with the most expensive health care system in the ‘developed’ world by a factor of two, substandard outcomes and half of the states rejecting Medicaid expansion, Obamacare has at best added complexity around the margins; it is an entire functioning health care system away from being successful. What will add to this dysfunction, and possibly push it to a social tipping point, will be the economic effects of the next inevitable financial crisis. As president, Barack Obama has been a continuation of the Carter-Reagan-Bush-Clinton-Bush line of bureaucrats of empire. They have all been successful in serving their constituencies. Maybe it’s ‘progress’ that a black man now has the job. But we aren’t his constituents. Wall Street and the sick-care corporations are. If Democrat apologists want to point to political successes, they should at least point to the right successes.

Rob Urie is an artist and political economist. His book Zen Economics is forthcoming.


http://www.counterpunch.org/2014/08/08/obama-and-the-revival-of-wall-street/

Are conservatives ‘hardwired’ to perceive threats?

Research with emotion-generating images suggests that liberals and conservatives are hardwired to see the world differently.

In a new study, researchers suggest that liberals and conservatives may disagree about politics partly because they are different people at the core—right down to their physiology and genetics.

John Alford, an associate professor of political science at Rice University and the study’s coauthor, says the research suggests that biology—not reason or the careful consideration of facts—predisposes people to see and understand the world in different ways.

“These natural tendencies to perceive the physical world in different ways may in turn be responsible for striking moments of political and ideological conflict throughout history,” Alford says.

“Conservatives are fond of saying that ‘liberals just don’t get it,’ and liberals are convinced that conservatives magnify threats—systematic evidence suggests both are correct,” says John Hibbing, lead author and professor of political science at the University of Nebraska–Lincoln.

Responding to images

The study found that compared to liberals, conservatives tend to register greater negative physiological responses to stimuli and devote more psychological resources to them.

For example, when study participants were shown pictures from the International Affective Picture System (IAPS, a database of pictures used to elicit a range of emotions), eye-tracking revealed that conservative respondents fixed their gaze on negative images more rapidly (less than 0.8 seconds for conservatives versus nearly 1.3 seconds for liberals) and stayed focused on negative images longer (about 2.8 seconds for conservatives versus about 1.9 seconds for liberals).

Alford notes that when eyes process images, they move so quickly that their movement is measured in milliseconds. He explains that most eyes are still moving even as they appear to be fixated on an image.

“The fact that conservatives focused on images for almost an entire second longer than liberals represents a significant difference,” Alford says.

The study included a random sample of 340 US adults over the age of 18 from Eastern Nebraska (54 percent female, mean age 45, 55 percent having at least some college experience). The sample included roughly equal numbers of politically conservative, liberal, and moderate individuals as measured by a five-point self-identification scale.

Subsets of these people, chosen to focus specifically on self-described liberals and conservatives, were shown a series of pre-rated IAPS images, some that had been rated as aversive and some that had been rated as appetitive, and measurements were made of their eye movements and physiological responses.

The paper also drew on dozens of pieces of research from experts focusing on the biology, neuroscience, and genetics of political ideology.

Different worlds?

Alford notes that bringing previous studies together with new research paints a bigger picture about how biology impacts political ideology.

“A recurring feature of human history is right-left political battles between adversaries who see a very different world,” Alford says.

“Empirical evidence is increasingly documenting the psychological and physiological differences across people that can lead them to perceive the world so differently. For example, one person focuses on threats, but when facing the same situation, another person focuses on opportunities.”

Alford says it is not surprising that these different visions of reality lead to fundamentally different sets of political preferences.

“We see the ‘negativity bias’ as a common finding that emerges from a large body of empirical studies done not just by us, but by many other research teams around the world,” says Kevin Smith, coauthor and professor and chair of political science at the University of Nebraska–Lincoln.

“We make the case in this article that negative bias clearly and consistently separates liberals from conservatives.

“By documenting that political differences are not necessarily traceable to misinformation or ignorance on the part of one side or the other, scientific understanding of the broader and deeper bases of political diversity may make it possible for Emerson’s forces of tradition and innovation to live together, if not more profitably, at least less violently.”

The study, funded in part by the National Science Foundation, appears in Behavioral and Brain Sciences.

Source: Rice University


http://www.futurity.org/conservatives-liberals-threat-743552/?utm_source=Futurity+Today&utm_campaign=ee68aa5031-August_7_20148_7_2014&utm_medium=email&utm_term=0_e34e8ee443-ee68aa5031-205990505

Sick of This Market-Driven World? You Should Be



These days, being a deviant from the prevailing wisdom is something to be proud of.


To be at peace with a troubled world: this is not a reasonable aim. It can be achieved only through a disavowal of what surrounds you. To be at peace with yourself within a troubled world: that, by contrast, is an honourable aspiration. This column is for those who feel at odds with life. It calls on you not to be ashamed.

I was prompted to write it by a remarkable book, just published in English, by a Belgian professor of psychoanalysis, Paul Verhaeghe. What About Me? The Struggle for Identity in a Market-Based Society is one of those books that, by making connections between apparently distinct phenomena, permits sudden new insights into what is happening to us and why.

We are social animals, Verhaeghe argues, and our identities are shaped by the norms and values we absorb from other people. Every society defines and shapes its own normality – and its own abnormality – according to dominant narratives, and seeks either to make people comply or to exclude them if they don’t.

Today the dominant narrative is that of market fundamentalism, widely known in Europe as neoliberalism. The story it tells is that the market can resolve almost all social, economic and political problems. The less the state regulates and taxes us, the better off we will be. Public services should be privatised, public spending should be cut, and business should be freed from social control. In countries such as the UK and the US, this story has shaped our norms and values for around 35 years: since Thatcher and Reagan came to power. It is rapidly colonising the rest of the world.

Verhaeghe points out that neoliberalism draws on the ancient Greek idea that our ethics are innate (and governed by a state of nature it calls the market) and on the Christian idea that humankind is inherently selfish and acquisitive. Rather than seeking to suppress these characteristics, neoliberalism celebrates them: it claims that unrestricted competition, driven by self-interest, leads to innovation and economic growth, enhancing the welfare of all.

At the heart of this story is the notion of merit. Untrammelled competition rewards people who have talent, work hard, and innovate. It breaks down hierarchies and creates a world of opportunity and mobility.

The reality is rather different. Even at the beginning of the process, when markets are first deregulated, we do not start with equal opportunities. Some people are a long way down the track before the starting gun is fired. This is how the Russian oligarchs managed to acquire such wealth when the Soviet Union broke up. They weren’t, on the whole, the most talented, hardworking or innovative people, but those with the fewest scruples, the most thugs, and the best contacts – often in the KGB.

Even when outcomes are based on talent and hard work, they don’t stay that way for long. Once the first generation of liberated entrepreneurs has made its money, the initial meritocracy is replaced by a new elite, which insulates its children from competition by inheritance and the best education money can buy. Where market fundamentalism has been most fiercely applied – in countries like the US and UK – social mobility has greatly declined.

If neoliberalism was anything other than a self-serving con, whose gurus and thinktanks were financed from the beginning by some of the world’s richest people (the US multimillionaires Coors, Olin, Scaife, Pew and others), its apostles would have demanded, as a precondition for a society based on merit, that no one should start life with the unfair advantage of inherited wealth or economically determined education. But they never believed in their own doctrine. Enterprise, as a result, quickly gave way to rent.

All this is ignored, and success or failure in the market economy is ascribed solely to the efforts of the individual. The rich are the new righteous; the poor are the new deviants, who have failed both economically and morally and are now classified as social parasites.

The market was meant to emancipate us, offering autonomy and freedom. Instead it has delivered atomisation and loneliness.

The workplace has been overwhelmed by a mad, Kafkaesque infrastructure of assessments, monitoring, measuring, surveillance and audits, centrally directed and rigidly planned, whose purpose is to reward the winners and punish the losers. It destroys autonomy, enterprise, innovation and loyalty, and breeds frustration, envy and fear. Through a magnificent paradox, it has led to the revival of a grand old Soviet tradition known in Russian as tufta. It means falsification of statistics to meet the diktats of unaccountable power.

The same forces afflict those who can’t find work. They must now contend, alongside the other humiliations of unemployment, with a whole new level of snooping and monitoring. All this, Verhaeghe points out, is fundamental to the neoliberal model, which everywhere insists on comparison, evaluation and quantification. We find ourselves technically free but powerless. Whether in work or out of work, we must live by the same rules or perish. All the major political parties promote them, so we have no political power either. In the name of autonomy and freedom we have ended up controlled by a grinding, faceless bureaucracy.

These shifts have been accompanied, Verhaeghe writes, by a spectacular rise in certain psychiatric conditions: self-harm, eating disorders, depression and personality disorders.

Of the personality disorders, the most common are performance anxiety and social phobia, both of which reflect a fear of other people, who are perceived as both evaluators and competitors – the only roles for society that market fundamentalism admits. Depression and loneliness plague us.

The infantilising diktats of the workplace destroy our self-respect. Those who end up at the bottom of the pile are assailed by guilt and shame. The self-attribution fallacy cuts both ways: just as we congratulate ourselves for our success, we blame ourselves for our failure, even if we have little to do with it.

So, if you don’t fit in, if you feel at odds with the world, if your identity is troubled and frayed, if you feel lost and ashamed – it could be because you have retained the human values you were supposed to have discarded. You are a deviant. Be proud.

George Monbiot is the author of Heat: How to Stop the Planet from Burning. Read more of his writings at Monbiot.com. This article originally appeared in the Guardian.

http://www.alternet.org/economy/sick-market-driven-world-you-should-be?paging=off&current_page=1#bookmark