Darwin’s Battle with Anxiety

Charles Darwin was undoubtedly among the most significant thinkers humanity has ever produced. But he was also a man of peculiar mental habits, from his stringent daily routine to his despairingly despondent moods to his obsessive list of the pros and cons of marriage. Those, it turns out, may simply have been Darwin’s best strategies for controlling a malady that dominated his life, the same one that afflicted Vincent van Gogh – chronic anxiety, which placed him among the legions of great minds evidencing the relationship between creativity and mental illness.

In My Age of Anxiety: Fear, Hope, Dread, and the Search for Peace of Mind (public library) – his sweeping mental health memoir, exploring our culture of anxiety and its costs – The Atlantic editor Scott Stossel examines Darwin’s prolific diaries and letters, proposing that the reason the great scientist spent a good third of his waking hours on the Beagle in bed or sick, as well as the cause of his lifelong laundry list of medical symptoms, was his struggle with anxiety.

Stossel writes:

Observers going back to Aristotle have noted that nervous dyspepsia and intellectual accomplishment often go hand in hand. Sigmund Freud’s trip to the United States in 1909, which introduced psychoanalysis to this country, was marred (as he would later frequently complain) by his nervous stomach and bouts of diarrhea. Many of the letters between William and Henry James, first-class neurotics both, consist mainly of the exchange of various remedies for their stomach trouble.

But for debilitating nervous stomach complaints, nothing compares to that which afflicted poor Charles Darwin, who spent decades of his life prostrated by his upset stomach.

That affliction of afflictions, Stossel argues, was Darwin’s overpowering anxiety – something that might explain why his influential studies of human emotion were of such intense interest to him. Stossel points to a “Diary of Health” that the scientist kept for six years between the ages of 40 and 46 at the urging of his physician. He filled dozens of pages with complaints like “chronic fatigue, severe stomach pain and flatulence, frequent vomiting, dizziness (‘swimming head,’ as Darwin described it), trembling, insomnia, rashes, eczema, boils, heart palpitations and pain, and melancholy.”

In 1865 – six years after the completion of The Origin of Species – a distraught 56-year-old Darwin wrote a letter to another physician, John Chapman, outlining the multitude of symptoms that had bedeviled him for decades:

For 25 years extreme spasmodic daily & nightly flatulence: occasional vomiting, on two occasions prolonged during months. Vomiting preceded by shivering, hysterical crying[,] dying sensations or half-faint. & copious very palid urine. Now vomiting & every passage of flatulence preceded by ringing of ears, treading on air & vision …. Nervousness when E leaves me.

“E” refers to his wife Emma, who loved Darwin dearly and who mothered his ten children – a context in which his “nervousness” does suggest anxiety’s characteristic tendency to wring worries out of unlikely scenarios, not to mention offering a textbook illustration of the very term “separation anxiety.”

Illustration from The Smithsonian’s Darwin: A Graphic Biography

Stossel chronicles Darwin’s descent:

Darwin was frustrated that dozens of physicians, beginning with his own father, had failed to cure him. By the time he wrote to Dr. Chapman, Darwin had spent most of the past three decades – during which time he’d struggled heroically to write On the Origin of Species – housebound by general invalidism. Based on his diaries and letters, it’s fair to say he spent a full third of his daytime hours since the age of twenty-eight either vomiting or lying in bed.

Chapman had treated many prominent Victorian intellectuals who were “knocked up” with anxiety at one time or another; he specialized in, as he put it, those high-strung neurotics “whose minds are highly cultivated and developed, and often complicated, modified, and dominated by subtle psychical conflicts, whose intensity and bearing on the physical malady it is difficult to comprehend.” He prescribed the application of ice to the spinal cord for almost all diseases of nervous origin.

Chapman came out to Darwin’s country estate in late May 1865, and Darwin spent several hours each day over the next several months encased in ice; he composed crucial sections of The Variation of Animals and Plants Under Domestication with ice bags packed around his spine.

The treatment didn’t work. The “incessant vomiting” continued. So while Darwin and his family enjoyed Chapman’s company (“We liked Dr. Chapman so very much we were quite sorry the ice failed for his sake as well as ours,” Darwin’s wife wrote), by July they had abandoned the treatment and sent the doctor back to London.

Chapman was not the first doctor to fail to cure Darwin, and he would not be the last. To read Darwin’s diaries and correspondence is to marvel at the more or less constant debilitation he endured after he returned from the famous voyage of the Beagle in 1836. The medical debate about what, exactly, was wrong with Darwin has raged for 150 years. The list of diagnoses proposed during his life and after his death is long: amoebic infection, appendicitis, duodenal ulcer, peptic ulcer, migraines, chronic cholecystitis, “smouldering hepatitis,” malaria, catarrhal dyspepsia, arsenic poisoning, porphyria, narcolepsy, “diabetogenic hyper-insulism,” gout, “suppressed gout,” chronic brucellosis (endemic to Argentina, which the Beagle had visited), Chagas’ disease (possibly contracted from a bug bite in Argentina), allergic reactions to the pigeons he worked with, complications from the protracted seasickness he experienced on the Beagle, and “refractive anomaly of the eyes.” I’ve just read an article, “Darwin’s Illness Revealed,” published in a British academic journal in 2005, that attributes Darwin’s ailments to lactose intolerance.

Competing hypotheses about Darwin’s condition proliferated both during his lifetime and after. But Stossel argues that “a careful reading of Darwin’s life suggests that the precipitating factor in every one of his most acute attacks of illness was anxiety.” His strongest rebuttal to the other medical theories is a seemingly simple yet profound piece of evidence:

When Darwin would stop working and go walking or riding in the Scottish Highlands or North Wales, his health would be restored.

(Of course, one need not suffer from debilitating anxiety in order to reap the physical and mental benefits of walking, arguably one of the simplest yet most rewarding forms of psychic restoration and a powerful catalyst for creativity.)

My Age of Anxiety is a fascinating read in its totality. Complement it with a timeless antidote to anxiety from Alan Watts, then revisit Darwin’s brighter side with his beautiful reflections on family, work, and happiness.

http://www.brainpickings.org/index.php/2014/08/28/darwin-anxiety/

Cancer, Politics and Capitalism


by LOUIS PROYECT

After working for a series of unsavory financial institutions for 15 years, I accepted a position as a database administrator at Memorial Sloan-Kettering Cancer Center (MSKCC) in 1983 with an eager sense of anticipation. Finally I would be doing something professionally that was more in sync with my political values. Instead of using my skills to keep track of pension trust portfolios, I would be creating a data infrastructure for patient care.

For more than a year I worked on developing a data model based on “normalized” relationships that sought to eliminate redundancies and provide a reliable foundation for applications development. A few months after I presented the model to management, I learned that all my work was in vain. The hospital had decided to buy a package from SMS, Inc. that was considered nonpareil when it came to debt collection. As happened too often, a loved one would check into the hospital for a couple of months of very expensive and painful treatments that came to an end with the patient’s death. Since the survivors often had a tendency to ignore the astronomical bills that went along with such an exercise in futility, the hospital decided to purchase a system that was very good at dunning if nothing else. That decision left me feeling deflated. Once again money ruled.
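For readers who have never touched a database, “normalization” just means structuring tables so that each fact is stored exactly once, with relationships expressed through keys instead of repeated columns. A minimal sketch in Python’s built-in sqlite3 module, using hypothetical patient and admission tables rather than anything resembling the actual model I built, shows the idea:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()

    # Normalized design: each patient fact lives in exactly one row;
    # admissions reference the patient by key instead of repeating
    # the name and address on every admission record.
    cur.executescript("""
    CREATE TABLE patient (
        patient_id INTEGER PRIMARY KEY,
        name       TEXT NOT NULL,
        address    TEXT
    );
    CREATE TABLE admission (
        admission_id INTEGER PRIMARY KEY,
        patient_id   INTEGER NOT NULL REFERENCES patient(patient_id),
        admitted_on  TEXT NOT NULL,
        ward         TEXT
    );
    """)

    cur.execute("INSERT INTO patient VALUES (1, 'Jane Doe', '123 Main St')")
    cur.execute("INSERT INTO admission VALUES (10, 1, '1984-03-01', 'Oncology')")
    cur.execute("INSERT INTO admission VALUES (11, 1, '1984-06-15', 'Oncology')")

    # A join reassembles the pieces on demand; correcting the patient's
    # address once corrects it for every admission.
    for row in cur.execute("""
        SELECT p.name, a.admitted_on, a.ward
        FROM admission AS a JOIN patient AS p USING (patient_id)
    """):
        print(row)

The payoff is integrity: because nothing is stored twice, nothing can fall out of sync, which is exactly the reliable foundation a normalized model is meant to buy.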

When I received an invitation to review “Second Opinion: Laetrile at Sloan-Kettering”, a documentary described as “the remarkable true story of a young science-writer at Memorial Sloan-Kettering Cancer Center, who risked everything by blowing the whistle on a massive cover-up involving a promising cancer therapy”, I knew that this was one I could not miss. (The film opens at Cinema Village in NYC on August 29, and at Laemmle Music Hall in LA on September 5. A national release will follow.)

Directed by Eric Merola, the film is made up primarily of Ralph W. Moss, the aforementioned young science-writer now 71, describing the events that took place when he was working at MSKCC in the mid-70s, filled with the same sense of idealism I would bring with me 8 years later. Like me, Moss was soon disillusioned, though for a different set of reasons.

Although I wasn’t aware of it at the time, my first encounter with Moss was when I worked at MSKCC, through the intermediary of a book he wrote titled “The Cancer Industry”. As I noted in a May 2012 article about MSKCC’s purchase of the SMS software, Moss’s book was a good introduction to the slimy realities of cancer care under capitalism:

When I was working at Sloan-Kettering, I read a terrific book titled “The Cancer Industry” that along with “The Cancer Wars” is essential reading for those with a class analysis. To this day, I remember what the book said about Hubert Humphrey’s stay at Sloan-Kettering. I don’t have the book handy but these paragraphs from a 1990 review should suffice:

Among the horror stories in The Cancer Industry is the case history of Senator Hubert Humphrey, who was operated on by a team of surgeons at Memorial Sloan-Kettering on October 6, 1976. His surgeon appeared before the press and television cameras to announce that the senator was cured by the operation, but as a preventive measure, to “wipe out any microscopic colonies of cancer cells that may be hidden in the body,” treatment would begin with experimental drugs. Moss describes the aftermath:

“Within about a year, Senator Humphrey was dead. In that short time he had withered from a vigorous middle-aged man to an old, balding and feeble cancer victim. Humphrey himself blamed chemotherapy … calling it `bottled death’ and refusing in the end to return to Memorial Hospital for drug treatment.”

Hired to work in the PR department for his writing ability and enthusiasm, Moss had an unsettling introduction. In the very week he started, he was shocked to discover that one of the hospital’s top researchers had been caught perpetrating a major fraud. After announcing to the world that he had completed an experiment that successfully transplanted skin from a black mouse to a white mouse, William Summerlin became an MSKCC superstar. Since the mice were of different strains and the transplant had not been rejected, this could have led to major breakthroughs in human organ transplants. When a lab assistant discovered quite by accident that Summerlin had simply used a black magic marker to draw a patch on a white mouse’s back, the hospital looked incompetent. Lewis Thomas, the hospital’s director, explained the incident as one caused by Summerlin’s “severe emotional disturbance”. I would have called it greed.

(I should add that Lewis Thomas is not one of my favorite people. He is the author of an essay titled “The Iks” that portrays this hunting-and-gathering people of the Ugandan wilderness as “an irreversibly disagreeable collection of unattached, brutish creatures, totally selfish and loveless.” Remembering Thomas’s essay from high school, filmmaker Cevin Soling traveled to Ikland to find out for himself whether this was true. Suffice it to say that Thomas’s essay was as bogus as Summerlin’s painted mouse.)

Around the same time, a more serious experiment was taking place at MSKCC. An octogenarian Japanese scientist named Kanematsu Sugiura, who had published 250 papers in a distinguished career, had begun treating mice with laetrile. The substance, also called amygdalin, is extracted from apricot pits. His findings: the mice that received laetrile were not cured of cancer, but they lived longer than untreated animals. Most importantly, the tumors did not metastasize in the treated mice. Trained to be cautious, Sugiura concluded only that the drug had palliative value. The implication, needless to say, was that further research was needed.

As word of Sugiura’s experiments filtered up to the MSKCC brass, they assigned Moss to cover them from a PR angle but just as much to snoop on the senior researcher. Not only did Moss fail to detect any irregularities, he became upset when he learned that the hospital had become convinced that Sugiura’s experiments were flawed and that research should be abandoned.

Convinced that he needed outside help to make the case for Sugiura’s experiments, Moss hooked up with Science for the People, a radical group that came out of the 1960s student movement. Working with a physician-activist named Alec Pruchnicki and without the knowledge of his superiors, Moss began publishing a newsletter called Second Opinion that was distributed outside of MSKCC just like most agitprop was in pre-Internet days. The newsletter soon became a sounding board for every kind of grievance at the hospital, including working conditions and patient treatment.

When the hospital called a press conference to disassociate itself from laetrile, Sugiura said that he stood by the hospital’s decision as well as his own findings. When asked by reporters how he could hold mutually opposed positions, he handled himself gracefully while standing his ground.

In a fascinating Science magazine article from December 23, 1977, Nicholas Wade—the NY Times reporter whose recent book on genetic inheritance most critics regard as a painted mouse—was loath to get on the anti-laetrile bandwagon despite the newspaper’s strong agreement with the top brass’s dismissal of Sugiura’s findings. Wade quotes Robert Good, the top immunologist at MSKCC: “If we had published those early positive data, it would have caused all kinds of havoc. The natural processes of science are just not possible in this kind of pressure cooker.”

In a phone conversation with Ralph Moss shortly after I viewed the film, I was struck by his unwillingness to assume the stance of a pro-laetrile activist even though he was obviously convinced that Sugiura’s experiments were valid. The film is making the case for considering alternatives to costly and often toxic medications that are making big pharma rich. He mentioned Avastin, a drug that generated $2.11 billion in sales in 2011. That, he added, was more than the GDP of many third world countries. The spirit of Science for the People continues in the work of Ralph W. Moss. See this film for a riveting account of the conflicts between corporate power and the public good.

Several weeks before I watched “Second Opinion”, I made a point of reading George Johnson’s recently published The Cancer Chronicles in order to get up to speed on current thinking about the disease. Back when I worked at MSKCC, I had also read Samuel Epstein’s “The Politics of Cancer”, a book that ties what was perceived at the time as a cancer epidemic to environmental toxins, especially pesticides. It was very much in the spirit of Barry Commoner’s “The Closing Circle” and amenable to my Marxist opposition to corporate indifference to our health and safety.

About ten years after reading “The Politics of Cancer”, I read Robert Proctor’s “The Cancer Wars”, which backtracked from Epstein’s findings. Although very much a man of the left, Proctor warned his readers that finding a direct correlation between pollutants and cancer is very difficult.

With Proctor’s warnings in the back of my mind, I was not completely surprised by Johnson’s treatment of the environmental question. In chapter seven, titled “Where Cancer Really Comes From”, Johnson amasses some statistics of the sort that pro-industry hacks might repeat. For example, epidemiology studies conclude that cancer cases in the immediate vicinity of Love Canal were no more frequent than in the rest of New York State, even though there was a spike in birth defects.

In referring to cancer clusters, such as the supposed breast cancer epidemic in Long Island, Johnson concludes that they are “statistical illusions”. It is not so much that Johnson denies a connection between cancer and the environment; it is that such connections are exceedingly difficult to prove.

Since I have, like most people on the left, become convinced that there is a connection between carcinogens in the water, soil and air and the incidence of cancer, I emailed Johnson with my concerns and referred him to a study of cancer clusters near heavily polluted rivers in China. Showing a grace uncommon among well-established journalists, Johnson took the trouble to write back:

Thanks very much for your email. I appreciate the kind words about my book. I hadn’t seen that particular study and will make a point of reading it. Of course many industrial chemicals are carcinogenic, and it seems very possible that concentrations have been high and chronic enough in China’s water to expose the general population to levels known to cause cancer in the workplace. Nailing that down is very tricky though, especially in developing countries where epidemiological studies are just getting underway. Most of the research in China seems to concentrate on air pollution and lung cancer. Since the focus of my book was on cancer in the developed world, I may write a column in the future comparing the situation with China, India, etc.

Making the case about pollution—a negative indicator—is difficult, but it is just as difficult with positive indicators. Nutritionists are always urging us to eat fruits and vegetables, especially those with anti-oxidant properties such as blueberries and cabbage, but there has never been a rigorous study of diet and cancer. This has a lot to do with the near impossibility of conducting a demographically representative study of the effects of eating “good” food and bad. Since cancer can take many decades to show up, tracking its roots and development is a near impossible task.

“The Cancer Chronicles” was motivated in part by the illness of Johnson’s wife, Nancy. Illustrating the difficulty of establishing any straight-line connection between diet and the disease, Nancy Johnson was something of a health nut, given to daily exercise and a large intake of the very anti-oxidant fruits and vegetables nutritionists advise. Chapter four begins:

She always ate her vegetables. Obsessively, it sometimes seemed. Breakfast, lunch, dinner, throughout the day she would keep mental count. Never mind if it was 10:30 p.m., halfway through a Simpsons episode or a DVD. If she hadn’t consumed two or three servings of vegetables (some green, some yellow) and three or four servings of fruits, nuts, grains—whatever the food pyramidologists were recommending—she would slice up an apple or open a bag of carrots.

Confronted by the sheer anomaly of a person with such a lifestyle being susceptible to cancer, Johnson sets out on a trek that takes him to conferences and labs all around the USA when he is not accompanying his wife on her frequent chemotherapy sessions. His goal was to understand the basic biology of humanity’s oldest disease.

Indeed, it is not just ours. The dinosaurs suffered from cancer as well. In a trip to western Colorado, Johnson visits the site where six tons of Brontosaurus bones were discovered in 1901, including the oldest bone known to have contained a tumor. Using prose that has been polished over a long and distinguished career in science journalism, he reports on what he saw:

Viewed head-on, the fossil measured 6.5 by 9.5 inches. Lodged inside its core was an intrusion, now crystallized, that had grown so large it had encroached into the outer bone. Bunge [a museum curator] suspected osteosarcoma—he had seen the damage the cancer can do to human skeletons, particularly those of children. Oval in shape and the size of a slightly squashed softball, the tumor had been converted over the millennia into agate.

Johnson’s book is one of the finest on science that I have read in a very long time, perhaps in my life. As I told Marxmail subscribers, if I had run into such a book when I was in high school, I probably would have majored in biology at Bard College rather than religion (don’t try to get me to explain that choice.)

Johnson’s book ranges from medicine to physics, and from physics to philosophy without missing a beat. At the risk of sounding like one of those people who write the blurbs on book jackets, I would describe “The Cancer Chronicles” as a powerful examination of the biology of the human cell, including those that mutate into the most dreaded disease we face.

Between 2008 and 2012, three men died of cancer, each about two years apart. The first to go was Peter Camejo, who helped me understand what went wrong with the Socialist Workers Party. Peter, who succumbed to lymphoma, attributed his illness to pollutants he had been exposed to over a lifetime.

Next to go was Harvey Pekar, the comic book author who persuaded me to work on a memoir with him. Peter Camejo was a character in the memoir, as were a number of other colorful characters I got to know over a lifetime in politics and the bohemian underground. Like Peter, Harvey died from lymphoma, or at least from a system weakened by the disease.

Finally, two years later, I learned of Alexander Cockburn’s death. Alexander was a kind of bookend to Peter. When I quit the Trotskyist movement in 1979, I intended to put politics behind me and return to the bohemia of my youth. In an attempt to keep up with the NYC underground, I began reading the Village Voice. But the only writing that made any kind of impression on me was Alexander Cockburn’s weekly columns that lacerated the high and mighty. It was his writing that moved me to return to politics, the only damned thing I am good at.

The more our lives become entwined with the Internet, and social media in particular, the closer we become to people even if we never meet in person. Over the past few years, I have been at the virtual bedside of two people for whom I have enormous respect. Using Facebook for both support and ventilation, Ed Douglas and Kristin Kolb have kept their friends abreast of their encounters with life-threatening illnesses. Additionally, both were able to raise funds through the Internet, a necessity given the lack of adequate health care in the USA. Ed, a founding member of New York Film Critics Online—the group I have been part of for 15 years—developed an acute case of leukemia some years ago that ultimately required a bone marrow transplant. Fortunately he is in remission now and doing well. Kristin, a CounterPunch contributor of great distinction, is going through the final stages of chemotherapy for breast cancer. We who contribute to and read CounterPunch offer our support for her getting past this ordeal.

If the origins of cancer and its ultimate cure are shrouded in mystery, the same cannot be said about the need for adequate and affordable care. If it were not for the generosity of Ed and Kristin’s friends and admirers, their road would have been a lot more difficult.

Mike Marqusee, another long-time CounterPunch contributor, made the wise choice of relocating in 1971 to Britain, where health care is free.

Around the time I started reading Johnson’s “The Cancer Chronicles”, I learned that Marqusee has been dealing with multiple myeloma for a number of years. He recently wrote a book that touches on his illness as well as Britain’s socialized medicine. Available from OR Books, “The Price of Experience: Writings on Living with Cancer” is both a personal history and a sharp-eyed analysis of the benefits of socialized medicine—as one would expect from a long-time Marxist.

You will notice that just above I refer to Marqusee “dealing” with cancer rather than the hackneyed term “battling”. As might be expected from an antiwar activist (Marqusee was on the steering committee of the Stop the War Coalition in Britain), Marqusee has little use for military metaphors. He writes:

Obituaries routinely inform us that so-and-so has died “after a brave battle against cancer.” Of course, we will never read that so-and-so has died “after a pathetically feeble battle against cancer.” But one thing that I have come to appreciate since being diagnosed with multiple myeloma (a cancer of the blood) two years ago is how unreal both notions are. It’s just not like that.

The emphasis on cancer patients’ “bravery” and “courage” implies that if you can’t “conquer” your cancer, there’s something wrong with you, some weakness or flaw. If your cancer progresses rapidly, is it your fault? Does it reflect some failure of will-power?

Like one of the characters in Michael Moore’s “Sicko” who lives in a country not befouled by big pharma and the insurance industry, Marqusee describes a system that is geared to human need rather than private profit. For all the years he has been receiving treatment at Barts, the nickname for St. Bartholomew’s, a London hospital founded in 1123 (!), he has never had to pay a penny. Despite being free, the treatment has been equal to that of some of the premier hospitals in the USA.

But the same forces that imposed Obamacare on us are conspiring to privatize and/or reduce the level of treatment in Britain. Showing the same sense of worker and patient solidarity that Ralph Moss’s newsletter sought to instill at MSKCC nearly 40 years ago, Marqusee writes, and with his words we conclude:

I hope staff at Barts resist this attack on their jobs, and on the essential, life-sustaining services they provide. It’s often seemed to me that Barts survives on their good will alone. They’ve already been hammered by a steady fall in real wages, and there is a sad fatalism among most, not helped by the patchiness of the union presence across the Trust. What’s vital is that they understand that what’s happening now is not about failings at Barts; it’s a manifestation of the general crisis in the NHS, a crisis brought about by cuts, fragmentation, and privatisation, and one that can only be addressed through a mass movement that forces a radical redirection in government policy.

Louis Proyect blogs at http://louisproyect.org and is the moderator of the Marxism mailing list. In his spare time, he reviews films for CounterPunch.

 

 

http://www.counterpunch.org/2014/08/29/cancer-politics-and-capitalism/

One-third of the US population has no retirement savings


By Jake Dean

22 August 2014

A new survey from Bankrate.com, conducted by Princeton Survey Research Associates International and accompanied by Bankrate’s Financial Security Index, underlines the deplorable economic conditions facing millions of Americans. More than one-third (36 percent) of the US population has no savings for retirement.

The Financial Security Index breaks down retirement savings by age group. For those aged 19 to 29, at the start of their working lives, it might be expected that a higher percentage (in this case 69 percent) would have no retirement savings. From ages 30 to 49, however, 33 percent have no savings, and from ages 50 to 64, 26 percent still have no savings. The findings are most disturbing for adults aged 65 and older, already at or past retirement age, 14 percent of whom have no retirement savings at all.

The survey measures how Americans feel about their personal finances compared with one year ago. Below are some of the significant findings highlighted in the report:

· 31 percent of parents said they are less comfortable with their debt than a year ago, compared with 21 percent of nonparents

· 32 percent say they feel less comfortable with their savings than a year ago, compared with 16 percent who feel more comfortable

· College graduates are more than twice as likely as those who never attended college to say that they are comfortable with their savings

· Individuals between the ages of 18 and 29 are twice as likely as seniors 65 and older to feel financially secure

· Part-time workers are twice as likely as full-time workers to have no savings plan

· Individuals living in suburban and rural areas are only half as likely as those living in urban areas to feel financially secure

While the report notes that the younger generation may feel more secure than those nearing retirement, young people have also been told they have nothing to worry about on this score. As part of the protracted social counterrevolution, bound up with the cult of competition, youth have been led to believe that any meager full-time job represents financial success, and hence financial security. And although the survey notes that college graduates may feel more secure than those who never attended college, it never addresses the question of financial or job security as companies slash wages and hire more part-time workers.

The survey, involving 1,003 participants, exposes the fictitious character of the economic “recovery” that is being hailed by bourgeois media and the Obama administration. The reality is that the economic recovery has only been a recovery for the financial and corporate elite.

Greg McBride, chief financial analyst for Bankrate.com, told USA Today, “These numbers are very troubling because the burden for retirement savings is increasingly on us as individuals with each passing day.”

“‘Well, I’ll just work forever,’ is not a viable retirement savings strategy, because you don’t control your own destiny on that. When we look at long-term unemployment, it’s concentrated on adults over age 50,” states McBride.

These findings are indeed revealing. The survey essentially reports that 65 is no longer seen as the threshold age for retirement; the new norm is to work until you die or until you are physically unable to do so.

While the solution proposed by McBride is to “save more and save it now,” what exactly is there to save? Several reports within the past year have confirmed the daunting reality behind Bankrate’s findings: workers are unable to put anything aside for retirement because there is nothing left over to save.

A report released earlier this year by the Federal Reserve, the US central bank, entitled Report on the Economic Well-Being of U.S. Households in 2013, reveals that a typical American household cannot raise $400 without borrowing money or selling possessions.

According to the report, nearly two-thirds of those under 45 are unable to set aside funds to cover their expenses for a three-month period. Seventy percent of respondents stated that they were no better off than they were in 2008, when the financial crisis began.

When the survey examined the population that attended college, one quarter of the respondents held student loan debt averaging $27,840, with 56 percent of the respondents stating that they “believe that the costs of the education outweighed any financial benefits they received from the education.”

Of the two-thirds of households that had any form of savings in 2008, a quarter reported that they had used “some” or “nearly all” of their savings “to pay for bills and expenses.”

Other surveys have confirmed the abysmal conditions facing vast numbers of working people. The nonprofit Employee Benefit Research Institute (EBRI) and Greenwald & Associates found in a survey of 1,000 individuals that 36 percent of respondents have less than $1,000 in savings and investments that could possibly be used for retirement, and that 60 percent of workers have less than $25,000.

In a separate survey conducted by EBRI, only 18 percent of workers said they are confident that they will have enough money to live comfortably in retirement, a situation forcing thousands to work longer and longer.

Despite the drumbeat of propaganda that seeks to convince the unemployed or the indebted that their problems are simply their own fault, the facts show otherwise. Jack VanDerhei, EBRI’s research director, said that the two primary factors creating such conditions were the cost of living and day-to-day expenses. That such expenses make it impossible for individuals to keep even a meager $1,000 in savings is an indictment of the capitalist system.

The participants in these surveys and findings are part of a broader social layer that is unable to meet even its most basic needs, much less to be able to prepare for the future or retirement.

Report after report points to the same conclusion: the precarious state of the majority of US households, who risk poverty or bankruptcy in the event of an unforeseen accident or loss of a job. What is happening in the US is the result of the deliberate policies of the Obama administration, which has shoveled trillions of dollars to the financial sector with no strings attached and without holding anyone responsible for the 2008 financial crisis, while encouraging companies to slash wages and benefits.

 

New report details depth of hunger crisis in the United States


By Shannon Jones
19 August 2014

A new study by the non-profit agency Feeding America reveals the extent and depth of the hunger crisis in the United States.

According to the report, “Hunger in America 2014,” about one in seven in the US, 46 million people, rely on food banks in order to feed themselves and their families. The number includes 12 million children and 7 million senior citizens. The findings are based on responses from a survey of 60,000 recipients of food aid from the 200 food banks that are affiliated with Feeding America.

The survey was conducted between April and August 2013 and thus does not reflect the impact of more recent cuts carried out by the Obama administration to the Supplemental Nutrition Assistance Program (SNAP), commonly referred to as food stamps. In February Obama signed legislation slashing $8.7 billion over ten years from the federal food stamp program. It followed cuts in November that slashed food assistance by $319 per year for a typical family of three.

Last year the US government reported that a record number of Americans, more than 47 million, were relying on food stamp benefits. According to Feeding America only 55 percent of those receiving aid through its affiliated food banks are also enrolled in the SNAP program.

The persistence of high levels of food insecurity, more than five years after the economic crash of 2008-2009, demonstrates that talk of an economic recovery is a fraud. In fact hardship is growing among wide layers of the population, including those who are working and attending school.

The numbers presented by the survey suggest that the problem of hunger in America is broader than suggested by the numbers for food stamp enrollment alone, and could embrace as much as 20 percent of the US population.

The Feeding America survey revealed a number of startling facts. According to the report 69 percent of respondents said they have to choose between food and paying utility bills, and 66 percent said they have to choose between food and medical care. Another 31 percent said they had to choose between food and education.

Among those who are having difficulty getting enough food to eat, one of the most common coping strategies is to purchase less healthy, cheaper foods in order to stretch dollars. Such foods, which often contain high levels of sugar and sodium, can contribute to a multitude of health problems including obesity, heart disease and diabetes.

The survey noted the high incidence of individuals with health problems among those seeking food assistance. Nearly 47 percent of those responding said they had fair or poor health. It found that 58 percent of households had a member with high blood pressure and 38 percent had a member with diabetes. Some 29 percent of households reported that no member had access to health insurance, including Medicare and Medicaid.

According to the survey 39 percent of respondent households include at least one child, a higher rate than the general population (32 percent). Six percent reported both children and seniors living in the same household.

Significantly, the study found that 54 percent of households had at least one member who was employed in the past year. The rate is even higher for households with children: 71 percent. In addition, many households reported members with education beyond high school, including some with two- and four-year college degrees. About 21 percent reported attending or graduating from college.

Some 43 percent of Feeding America clients are white. Twenty-six percent are African American and 20 percent Latino. Nearly 15 percent of client households are multi-racial.

Requests for food assistance are up even for those serving in the US military, the employer of last resort for many working-class youth. The study found that almost 620,000 households enrolled in Feeding America programs had at least one family member currently in the US military. That figure amounts to 25 percent of all US military households.

Another striking statistic contained in the report relates to the large numbers of college students seeking food assistance. According to the report, 10 percent of adult aid recipients are students, including two million attending school full-time and one million part-time.

Burdened with crushing loads of student debt and unable to find decent-paying work, many college students are turning to food pantries. In fact, more and more colleges and universities have opened food pantries in response to the growing problem of food insecurity among students.

Hunger inhibits learning in many ways, by reducing concentration and inhibiting the ability to retain material. Some students even have to forego purchasing course textbooks in order to pay for food.

According to researchers at Oregon State University, 59 percent of the students at Western Oregon University, a liberal arts college located in Monmouth, Oregon, are food insecure. A 2011 survey at the City University of New York found that 39.2 percent of the system’s 250,000 undergraduates had experienced food insecurity at some time in the past year.

Between 2007 and 2011 the number of food stamp recipients holding doctoral degrees tripled, according to a 2012 report in the Chronicle of Higher Education. At Michigan State University, the on-campus food pantry reports that more than 50 percent of its clients are graduate students.

The Carnage of Capitalism



Capitalism is expanding like a tumor in the body of American society, spreading further into vital areas of human need like health and education.

Milton Friedman said in 1980: “The free market system distributes the fruits of economic progress among all people.” The father of the modern neoliberal movement couldn’t have been more wrong. Inequality has been growing for 35 years, worsening since the 2008 recession, as a few well-positioned Americans have made millions while the rest of us have gained almost nothing. Now, our college students and medicine-dependent seniors have become the source of new riches for the profit-seeking free-marketers.

Higher Education: Administrators Get Most of the Money

College grads took a 19 percent pay cut in the two years after the recession. By 2013 over half of employed black recent college graduates were working in occupations that typically do not require a four-year college degree. For those still in school, tuition has risen much faster than any other living expense, and the average student loan balance has risen 91 percent over the past ten years.

At the other extreme is the winner-take-all free-market version of education, with a steady flow of compensation towards the top. Remarkably, and not coincidentally, as inequality has surged since the 1980s, the number of administrators at private universities has doubled. Administrators now outnumber faculty on every campus across the country.

These administrators are taking the big money. As detailed by Lawrence Wittner, the 25 highest-paid presidents increased their salaries by a third between 2009 and 2012, to nearly a million dollars each. For every million-dollar public university president in 2011, there were fourteen such presidents at private universities, and dozens of lower-level administrators aspiring to be paid like their bosses. At Purdue, for example, the 2012 administrative ranks included a $313,000-a-year acting provost, a $198,000 chief diversity officer, a $253,000 marketing officer and a $433,000 business school chief.

All this money at the top has to come from somewhere, and that means from faculty and students. Adjunct and student teachers, who made up about 22 percent of instructional staff in 1969, now make up an estimated 76 percent of instructional staff in higher education, with a median wage in 2010 of about $2,700 per course. More administrative money comes from tuition, which has increased by over 1,000 percent since 1978.

At the for-profit colleges, according to a Senate report on 2009 expenses, education companies spent about 23 percent of all revenue on marketing and advertising, and almost 20 percent of revenue on pre-tax profits for their shareholders. They spent just 17.2 percent of their revenue on instruction.

Medicine: A 10,000 Percent Profit for Corporations

As with education, the extremes forced upon us by free-market health care are nearly beyond belief. First, at the human end, 43 percent of sick Americans skipped doctor visits and/or medication purchases in 2011 because of excessive costs. It’s estimated that over 40,000 Americans die every year because they can’t afford health insurance.

At the corporate end, drugmakers are at times getting up to $100 for every $1 spent. That’s true at Gilead Sciences, the manufacturer of the drug Sovaldi, which charges about $10 a pill to its customers in Egypt, then comes home to charge $1,000 a pill to its American customers. A return of the same order is projected for the increasingly lucrative, government-funded Human Genome Project, which is estimated to potentially return about $140 for every $1 spent. Big business is quickly making its move: Celera Genomics, Abbott Labs, Merck, Roche, Bristol-Myers Squibb, and Pfizer are all starting to cash in.

The extremes of capitalist greed are evident in the corporate lobbying of Congress to keep Medicare from negotiating better drug prices for the American consumer. Americans are cheated further when corporations pay off generic drug manufacturers to delay entry of their products into the market, thereby ensuring inflated profits for the big firms for the duration of their shady deals.

Global Greed

Lives are being ravaged by unregulated, free-market capitalism, in the U.S. and around the world. According to the Global Forum for Health Research, less than 10 percent of the global health research budget is spent on the conditions responsible for 90 percent of human disease.

And the greed is getting worse. Perhaps it’s our irrational fear of socialism, which peaked in the years after World War II, that has inspired our winner-take-all culture. In the Reagan era we listened to Margaret Thatcher proclaim that “there is no such thing as society.”

In a more socially-conscious time, in 1955, after Dr. Jonas Salk had developed the polio vaccine, he was asked by reporter Edward R. Murrow: “Who owns the patent on this vaccine?” Responded Salk, “Well, the people, I would say. There is no patent. Could you patent the sun?”

A free-market capitalist might remind us that a skillful hedge fund manager can make as much as a thousand Jonas Salks.

 

Paul Buchheit teaches economic inequality at DePaul University. He is the founder and developer of the Web sites UsAgainstGreed.org, PayUpNow.org and RappingHistory.org, and the editor and main author of “American Wars: Illusions and Realities” (Clarity Press). He can be reached at paul@UsAgainstGreed.org.

http://www.alternet.org/economy/carnage-capitalism?akid=12138.265072.pC6w-o&rd=1&src=newsletter1015885&t=13&paging=off&current_page=1#bookmark

Facebook, email and the neuroscience of always being distracted

I used to be able to read for hours without digital interruption. Now? That’s just funny. I want my focus back!

"War and Peace" tortured me: Facebook, email and the neuroscience of always being distracted
This essay is adapted from “The End of Absence”

I’m enough of a distraction addict that a low-level ambient guilt about not getting my real work done hovers around me for most of the day. And this distractible quality in me pervades every part of my life. The distractions—What am I making for dinner?, Who was that woman in “Fargo”?, or, quite commonly, What else should I be reading?—are invariably things that can wait. What, I wonder, would I be capable of doing if I weren’t constantly worrying about what I ought to be doing?

And who is this frumpy thirty-something man who has tried to read “War and Peace” five times, never making it past the garden gate? I took the tome down from the shelf this morning and frowned again at those sad little dog-ears near the fifty-page mark.

Are the luxuries of time on which deep reading is reliant available to us anymore? Even the attention we deign to give to our distractions, those frissons, is narrowing.

It’s important to note this slippage. As a child, I would read for hours in bed without the possibility of a single digital interruption. Even the phone (which was anchored by wires to the kitchen wall downstairs) was generally mute after dinner. Our two hours of permitted television would come to an end, and I would seek out the solitary refuge of a novel. And deep reading (as opposed to reading a Tumblr feed) was a true refuge. What I liked best about that absorbing act was the fact that books became a world unto themselves, one that I (an otherwise powerless kid) had some control over. There was a childish pleasure in holding the mysterious object in my hands; in preparing for the story’s finale by monitoring what Austen called a “tell-tale compression of the pages”; in proceeding through some perfect sequence of plot points that bested by far the awkward happenstance of real life.

The physical book, held, knowable, became a small mental apartment I could have dominion over, something that was alive because of my attention and then lived in me.

But now . . . that thankful retreat, where my child-self could become so lost, seems unavailable to me. Today there is no room in my house, no block in my city, where I am unreachable.

Eventually, if we start giving them a chance, moments of absence reappear, and we can pick them up if we like. One appeared this morning, when my partner flew to Paris. He’ll be gone for two weeks. I’ll miss him, but this is also my big break.



I’ve taken “War and Peace” back down off the shelf. It’s sitting beside my computer as I write these lines—accusatory as some attention-starved pet.

You and me, old friend. You, me, and two weeks. I open the book, I shut the book, and I open the book again. The ink swirls up at me. This is hard. Why is this so hard?

* * *

Dr. Douglas Gentile, a friendly professor at Iowa State University, recently commiserated with me about my pathetic attention span. “It’s me, too, of course,” he said. “When I try to write a paper, I can’t keep from checking my e-mail every five minutes. Even though I know it’s actually making me less productive.” This failing is especially worrying for Gentile because he happens to be one of the world’s leading authorities on the effects of media on the brains of the young. “I know, I know! I know all the research on multitasking. I can tell you absolutely that everyone who thinks they’re good at multitasking is wrong. We know that in fact it’s those who think they’re good at multitasking who are the least productive when they multitask.”

The brain itself is not, whatever we may like to believe, a multitasking device. And that is where our problem begins. Your brain does a certain amount of parallel processing in order to synthesize auditory and visual information into a single understanding of the world around you, but the brain’s attention is itself only a spotlight, capable of shining on one thing at a time. So the very word multitask is a misnomer. There is rapid-shifting minitasking, there is lame-spasms-of-effort-tasking, but there is, alas, no such thing as multitasking. “When we think we’re multitasking,” says Gentile, “we’re actually multiswitching.”
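A toy model makes the arithmetic of multiswitching concrete. This is a back-of-the-envelope sketch, not anything from Gentile’s research, and the cost numbers are invented; the point is only the shape of the result: if every swing of the attentional spotlight carries a fixed re-orientation cost, interleaving two tasks is strictly slower than batching them.

    # Toy model of attention switching. Each unit of work takes WORK_COST
    # minutes; every switch between tasks adds SWITCH_COST minutes of
    # re-orientation. Both numbers are invented for illustration.
    WORK_COST = 1.0
    SWITCH_COST = 0.5

    def total_time(schedule):
        """Time to complete a schedule: a list of task labels, one work unit each."""
        elapsed, last = 0.0, None
        for task in schedule:
            if last is not None and task != last:
                elapsed += SWITCH_COST  # pay the re-orientation penalty
            elapsed += WORK_COST
            last = task
        return elapsed

    sequential = ["email"] * 10 + ["writing"] * 10   # batch each task
    interleaved = ["email", "writing"] * 10          # "multitask"

    print(total_time(sequential))   # 20.5 minutes: one switch
    print(total_time(interleaved))  # 29.5 minutes: nineteen switches

The same twenty units of work take almost half again as long when the spotlight swings after every unit, which is the sense in which the self-styled multitaskers in the research end up getting less done.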

We can hardly blame ourselves for being enraptured by the promise of multitasking, though. Computers—like televisions before them—tap into a very basic brain function called an “orienting response.” Orienting responses served us well in the wilderness of our species’ early years. When the light changes in your peripheral vision, you must look at it because that could be the shadow of something that’s about to eat you. If a twig snaps behind you, ditto. Having evolved in an environment rife with danger and uncertainty, we are hardwired to always default to fast-paced shifts in focus. Orienting responses are the brain’s ever-armed alarm system and cannot be ignored.

Gentile believes it’s time for a renaissance in our understanding of mental health. To begin with, just as we can’t accept our body’s cravings for chocolate cake at face value, neither can we any longer afford to indulge the automatic desires our brains harbor for distraction.

* * *

It’s not merely difficult at first. It’s torture. I slump into the book, reread sentences, entire paragraphs. I get through two pages and then stop to check my e-mail—and down the rabbit hole I go. After all, one does not read “War and Peace” so much as suffer through it. It doesn’t help that the world at large, being so divorced from such pursuits, is often aggressive toward those who drop away into single-subject attention wells. People don’t like it when you read “War and Peace.” It’s too long, too boring, not worth the effort. And you’re elitist for trying.

In order to finish the thing in the two weeks I have allotted myself, I must read one hundred pages each day without fail. If something distracts me from my day’s reading—a friend in the hospital, a magazine assignment, sunshine—I must read two hundred pages on the following day. I’ve read at this pace before, in my university days, but that was years ago and I’ve been steadily down-training my brain ever since.

* * *

Another week has passed—my “War and Peace” struggle continues. I’ve realized now that the subject of my distraction is far more likely to be something I need to look at than something I need to do. There have always been activities—dishes, gardening, sex, shopping—that derail whatever purpose we’ve assigned to ourselves on a given day. What’s different now is the addition of so much content that we passively consume.

Only this morning I watched a boy break down crying on “X Factor,” then regain his courage and belt out a half-decent rendition of Beyoncé’s “Listen”; next I looked up the original Beyoncé video and played it twice while reading the first few paragraphs of a story about the humanity of child soldiers; then I switched to a Nina Simone playlist prepared for me by Songza, which played while I flipped through a slide show of American soldiers seeing their dogs for the first time in years; and so on, ad nauseam. Until I shook myself out of this funk and tried to remember what I’d sat down to work on in the first place.

* * *

If I’m to break from our culture of distraction, I’m going to need practical advice, not just depressing statistics. To that end, I switch gears and decide to stop talking to scientists for a while; I need to talk to someone who deals with attention and productivity in the so-called real world, someone with a big smile and tailored suits, such as organizational guru Peter Bregman. He runs a global consulting firm that gets CEOs to unleash the potential of their workers, and he’s also the author of the acclaimed business book 18 Minutes, which counsels readers to take a minute out of every work hour, plus five minutes at the start and end of the day, to do nothing but set an intention (eight hourly minutes plus the two five-minute bookends add up to the titular eighteen).

Bregman told me he sets his watch to beep every hour as a reminder that it’s time to right his course again. Aside from the intention setting, Bregman counsels no more than three e-mail check-ins a day. This notion of batch processing was anathema to someone like me, used to checking my in-box so constantly, particularly when my work feels stuck. “It’s incredibly inefficient to switch back and forth,” said Bregman, echoing every scientist I’d spoken to on multitasking. “Besides, e-mail is, actually, just about the least efficient mode of conversation you can have. And what we know about multitasking is that, frankly, you can’t. You just derail.”

“I just always feel I’m missing something important,” I said. “And that’s precisely why we lose hours every day, that fear.” Bregman argues that it’s people who can get ahead of that fear who end up excelling in the business world that he spends his own days in. “I think everyone is more distractible today than we used to be. It’s a very hard thing to fix. And as people become more distracted, we know they’re actually doing less, getting less done. Your efforts just leak out. And those who aren’t—aren’t leaking—are going to be the most successful.”

I hate that I leak. But there’s a religious certainty required in order to devote yourself to one thing while cutting off the rest of the world. We don’t know that the inbox is emergency-free; we don’t know that the work we’re doing is the work we ought to be doing. But we can’t move forward in a sane way without having some faith in the moment we’ve committed to. “You need to decide that things don’t matter as much as you might think they matter,” Bregman suggested as I told him about my flitting ways. And that made me think there might be a connection between the responsibility-free days of my youth and that earlier self’s ability to concentrate. My young self had nowhere else to be, no permanent anxiety nagging at his conscience. Could I return to that sense of ease? Could I simply be where I was and not seek out a shifting plurality to fill up my time?

* * *

It happened softly and without my really noticing.

As I wore a deeper groove into the cushions of my sofa, so the book I was holding wore a groove into my (equally soft) mind. Moments of total absence began to take hold more often; I remembered what it was like to be lost entirely in a well-spun narrative. There was the scene where Anna Mikhailovna begs so pitifully for a little money, hoping to send her son to war properly dressed. And there were, increasingly, more like it. More moments where the world around me dropped away and I was properly absorbed. A “causeless springtime feeling of joy” overtakes Prince Andrei; a tearful Pierre sees in a comet his last shimmering hope; Emperor Napoleon takes his troops into the heart of Russia, oblivious to the coming winter that will destroy them all…

It takes a week or so for withdrawal symptoms to work through a heroin addict’s body. While I wouldn’t pretend to compare severity here, doubtless we need patience, too, when we deprive ourselves of the manic digital distractions we’ve grown addicted to.

That’s how it was with my Tolstoy and me. The periods without distraction grew longer; I settled into the sofa and couldn’t hear the phone, couldn’t hear the ghost-buzz of something else to do. I’m teaching myself to slip away from the world again.

* * *

Yesterday I fell asleep on the sofa with a few dozen pages of “War and Peace” to go. I could hear my cell phone buzzing from its perch on top of the piano. I saw the glowing green eye of my Cyclops modem as it broadcast potential distraction all around. But on I went past the turgid military campaigns and past the fretting of Russian princesses, until sleep finally claimed me and my head, exhausted, dreamed of nothing at all. This morning I finished the thing at last. The clean edges of its thirteen hundred pages have been ruffled down into a paper cabbage, the cover is pilled from the time I dropped it in the bath. Holding the thing aloft, trophy style, I notice the book is slightly larger than it was before I read it.

It’s only after the book is laid down, and I’ve quietly showered and shaved, that I realize I haven’t checked my e-mail today. The thought of that duty comes down on me like an anvil.

Instead, I lie back on the sofa and think some more about my favorite reader, Milton – about his own anxieties around reading. By the mid-1650s, he had suffered that larger removal from the crowds: he had lost his vision entirely and could not read at all – at least not with his own eyes. From within this new solitude, he worried that he could no longer meet his potential. One sonnet, written shortly after the loss of his vision, begins:

When I consider how my light is spent,
Ere half my days, in this dark world and wide,
And that one Talent which is death to hide
Lodged with me useless . . .

Yet from that position, in the greatest of caves, he began producing his greatest work. The epic “Paradise Lost,” a totemic feat of concentration, was dictated to aides, including his three daughters.

Milton already knew, after all, the great value in removing himself from the rush of the world, so perhaps those anxieties around his blindness never had a hope of dominating his mind. I, on the other hand, and all my peers, must make a constant study of concentration itself. I slot my ragged “War and Peace” back on the shelf. It left its marks on me the same way I left my marks on it (I feel as awake as a man dragged across barnacles on the bottom of some ocean). I think: This is where I was most alive, most happy. How did I go from loving that absence to being tortured by it? How can I learn to love that absence again?

This essay is adapted from “The End of Absence” by Michael Harris, published by Current / Penguin Random House.


http://www.salon.com/2014/08/17/war_and_peace_tortured_me_facebook_email_and_the_neuroscience_of_always_being_distracted/?source=newsletter

Face Time: Eternal Youth Has Become a Growth Industry in Silicon Valley

Tuesday, Aug 12, 2014

The students of Timothy Draper’s University of Heroes shuffle into a conference room, khaki shorts swishing against their knees, flip-flops clacking against the carpeted floor. One by one they take their seats and crack open their laptops, training their eyes on Facebook home pages or psychedelic screen savers. An air conditioner whirs somewhere in the rafters. A man in chinos stands before them.

The man is Steve Westly, former state controller, prominent venture capitalist, 57-year-old baron of Silicon Valley. He smiles at the group with all the sheepishness of a student preparing for show-and-tell. He promises to be brief.

“People your age are changing the world,” Westly tells the students, providing his own list of great historical innovators: Napoleon, Jesus, Zuckerberg, Larry, Sergey. “It’s almost never people my age,” he adds.

Students at Draper University — a private, residential tech boot camp launched by venture capitalist Timothy Draper in what was formerly San Mateo’s Benjamin Franklin Hotel — have already embraced Westly’s words as a credo. They inhabit a world where success and greatness seem to hover within arm’s reach. A small handful of those who complete the six-week, $9,500 residential program might get a chance to join Draper’s business incubator; an even smaller handful might eventually get desks at an accelerator run by Draper’s son, Adam. It’s a different kind of meritocracy than the one Westly braved while pursuing an MBA at Stanford in the early ’80s. At Draper University, heroism is merchandised rather than earned. A 20-year-old with bright eyes and deep pockets (or a parent who can front the tuition) has no reason to think he won’t be the next big thing.

This is the dogma that glues Silicon Valley together. Young employees are plucked out of high school; college-aged interns trade their frat houses and dorm rooms for luxurious corporate housing. Twenty-seven-year-old CEOs inspire their workers with snappy jingles about moving fast and breaking things. Entrepreneurs pitch their business plans in a slangy, tech-oriented patois.

Gone are the days of the “company man” who spends 30 years ascending the ranks in a single corporation. Having an Ivy League pedigree and a Brooks Brothers suit is no longer as important.

“Let’s face it: The days of the ‘gold watch’ are over,” 25-year-old writer David Burstein says. “The average millennial is expected to have several jobs by the time he turns 38.”

Yet if constant change is the new normal, then older workers have a much harder time keeping up. The Steve Westlys of the world are fading into management positions. Older engineers are staying on the back end, working on system administration or architecture rather than serving as the driving force of a company.

“If you lost your job, it might be hard to find something similar,” a former Google contractor says, noting that an older engineer might have to settle for something with a lower salary, or even switch fields. The contractor says he knows a man who graduated from Western New England University in the 1970s with a degree in the somewhat archaic field of time-motion engineering. That engineer wound up working at Walmart.

Those who do worm their way into the Valley workforce often have a rough adjustment. The former contractor, who is in his 40s, says he was often the oldest person commuting from San Francisco to Mountain View on a Google bus. And he adhered to a different schedule: Wake up at 4:50 a.m., get out the door by 6:20, catch the first coach back at 4:30 p.m. to be home for a family supper. He was one of the few people who didn’t take advantage of the free campus gyms or gourmet cafeteria dinners or on-site showers. He couldn’t hew to a live-at-work lifestyle.

And compared to other middle-aged workers, he had it easy.

In a lawsuit filed in San Francisco Superior Court in July, former Twitter employee Peter H. Taylor claims he was canned because of his age, despite performing his duties in “an exemplary manner.” Taylor, who was 57 at the time of his termination in September of last year, says his supervisor made at least one derogatory remark about his age, and that the company refused to accommodate his disabilities following a bout with kidney stones. He says he was ultimately replaced by several employees in their 20s and 30s. A Twitter spokesman says the lawsuit is without merit and that the company will “vigorously” defend itself.

The case is not without precedent. Computer scientist Brian Reid lobbed a similar complaint against Google in 2004, claiming co-workers called him an “old man” and an “old fuddy-duddy,” and routinely told him he was not a “cultural fit” for the company. Reid was 54 at the time he filed the complaint; he settled for an undisclosed amount of money.

What is surprising, perhaps, is that a 57-year-old man was employed at Twitter at all. “Look, Twitter has no 50-year-old employees,” the former Google contractor says, smirking. “By the time these [Silicon Valley] engineers are in their 40s, they’re old — they have houses, boats, stock options, mistresses. They drive to work in Chevy Volts.”

There’s definitely a swath of Valley nouveau riche who reap millions in their 20s and 30s, and who are able to cash out and retire by age 40. But that’s a minority of the population. The reality, for most people, is that most startups fail, most corporations downsize, and most workforces churn. Switching jobs every two or three years might be the norm, but it’s a lot easier to do when you’re 25 than when you’re 39. At that point, you’re essentially a senior citizen, San Francisco Botox surgeon Seth Matarasso says.

“I have a friend who lived in Chicago and came back to Silicon Valley at age 38,” Matarasso recalls. “And he said, ‘I feel like a grandfather — in Chicago I just feel my age.’”

Retirement isn’t an option for the average middle-aged worker, and even the elites — people like Westly, who were once themselves wunderkinds — find themselves in an awkward position when they hit their 50s, pandering to audiences that may have no sense of what came before. The diehards still work well past their Valley expiration date, but then survival becomes a job unto itself. Sometimes it means taking lower-pay contract work, or answering to a much younger supervisor, or seeking workplace protection in court.

CONTINUED: http://www.sfweekly.com/sanfrancisco/silicon-valley-bottom-age-discrimination/Content?oid=3079530