How to Stop Time

Credit Viktor Hachmang

In the unlikely event that we could ever unite under the banner of a single saint, it might just be St. Expeditus. According to legend, when the Roman centurion decided to convert to Christianity, the Devil appeared in the form of a crow and circled above him crying “cras, cras” — Latin for “tomorrow, tomorrow.” Expeditus stomped on the bird and shouted victoriously, “Today!” For doing so, Expeditus achieved salvation, and is worshiped as the patron saint of procrastinators. Sometimes you see icons of him turned upside down like an hourglass in the hope that he’ll hurry up and help you get your work done so he can be set right-side up again. From job-seekers in Brazil to people who run e-commerce sites in New Orleans, Expeditus is adored not just for his expediency, but also for his power to settle financial affairs. There is even a novena to the saint on Facebook.

Expeditus was martyred in A.D. 303, but was resurrected around the time of the Industrial Revolution, as the tempo of the world accelerated with breathtaking speed. Sound familiar? Today, as the pace of our lives quickens and the demands placed on us multiply, procrastination is the archdemon many of us wrestle with daily. It would seem we need Expeditus more than ever.

“Procrastination, quite frankly, is an epidemic,” declares Jeffery Combs, the author of “The Procrastination Cure,” just one in a vast industry of self-help books selling ways to crush the beast. The American Psychological Association estimates that 20 percent of American men and women are “chronic procrastinators.” Figures place the amount of money lost in the United States to procrastinating employees at trillions of dollars a year.

A recent infographic in The Economist revealed that in the 140 million hours humanity spent watching “Gangnam Style” on YouTube two billion times, we could have built at least four more (desperately needed) pyramids at Giza. Endless articles pose the question of why we procrastinate, what’s going wrong in the brain, how to overcome it, and the fascinating irrationality of it all.

But if procrastination is so clearly a society-wide, public condition, why is it always framed as an individual, personal deficiency? Why do we assume our own temperaments and habits are at fault — and feel bad about them — rather than question our culture’s canonization of productivity?

I was faced with these questions at an unlikely event this past July — an academic conference on procrastination at the University of Oxford. It brought together a bright and incongruous crowd: an economist, a poetry professor, a “biographer of clutter,” a queer theorist, a connoisseur of Iraqi coffee-shop culture. There was the doctoral student who spoke on the British painter Keith Vaughan, known to procrastinate through increasingly complicated experiments in auto-erotica. There was the children’s author who tied herself to her desk with her shoelaces.

The keynote speaker, Tracey Potts, brought a tin of sugar cookies she had baked in the shape of the notorious loiterer Walter Benjamin. The German philosopher famously procrastinated on his “Arcades Project,” a colossal meditation on the cityscape of Paris where the figure of the flâneur — the procrastinator par excellence — would wander. Benjamin himself fatally dallied in escaping the city ahead of the Nazis. He took his own life, leaving the manuscript forever unfinished, more evidence, it would seem, that no avoidable delay goes unpunished.

As we entered the ninth, grueling hour of the conference, a professor laid out a taxonomy of dithering so enormous that I couldn’t help but wonder: Whatever you’re doing, aren’t you by nature procrastinating from doing something else? Seen in this light, procrastination begins to look a lot like just plain existing. But then along come its foot soldiers — guilt, self-loathing, blame.

Dr. Potts explained how procrastination entered the field as pathological behavior in the mid-20th century. Drawing on the work of the British-born historian Christopher Lane, Dr. Potts directed our attention to a United States War Department bulletin issued in 1945 that chastised soldiers who were avoiding their military duties “by passive measures, such as pouting, stubbornness, procrastination, inefficiency and passive obstructionism.” In 1952, when the American Psychiatric Association assembled the first edition of the Diagnostic and Statistical Manual of Mental Disorders — the bible of mental health used to determine illness to this day — it copied the passage from the cranky military memo verbatim.

And so, procrastination became enshrined as a symptom of mental illness. By the mid-60s, passive-aggressive personality disorder had become a fairly common diagnosis and “procrastination” remained listed as a symptom in several subsequent editions. “Dawdling” was added to the list, after years of delay.

While passive-aggressive personality disorder has been erased from the official portion of the manual, the stigma of slothfulness remains. Many of us, it seems, are still trying to enforce a military-style precision on our intellectual, creative, civilian lives — and often failing. Even at the conference, participants proposed strategies for beating procrastination that were chillingly martial. The economist suggested that we all “take hostages” — place something valuable at stake as a way of negotiating with our own belligerent minds. The children’s author writes large checks out to political parties she loathes, and entrusts them to a relative to mail if she misses a deadline.

All of which leads me to wonder: Are we imposing standards on ourselves that make us mad?

Though Expeditus’s pesky crow may be ageless, procrastination as epidemic — and the constant guilt that goes with it — is peculiar to the modern era. The 21st-century capitalist world, in its never-ending drive for expansion, consecrates an always-on productivity for the sake of the greater fiscal health.

In an 1853 short story, Herman Melville gave us Bartleby, the obstinate scrivener and apex procrastinator, who confounds the requests of his boss with his hallowed mantra, “I would prefer not to.” A perfect employee on the surface — he never leaves the office and sleeps at his desk — Bartleby represents a total rebellion against the expectations placed on him by society. Politely refusing to accept money or to remove himself from his office even after he is fired, the copyist went on to have an unexpected afterlife — as a hero of the Occupy movement in 2012. “Bartleby was the first laid-off worker to occupy Wall Street,” Jonathan D. Greenberg noted in The Atlantic. Confronted with Bartleby’s serenity and his utter noncompliance with the status quo, his perplexed boss is left wondering whether he himself is the one who is mad.

A month before the procrastination conference, I set myself the task of reading “Oblomov,” the 19th-century Russian novel by Ivan Goncharov about the ultimate slouch, who, over the course of 500 pages, barely moves from his bed, and then only to shift to the sofa. At least that’s what I heard: I failed to make it through more than two pages at a sitting without putting the novel down and allowing myself to drift off. I would carry the heavy book everywhere with me — it was like an anchor into a deep, blissful sea of sleep.

Oblomov could conduct the few tasks he cared to from under his quilt — writing letters, accepting visitors — but what if he’d had an iPhone and a laptop? Being in bed is now no excuse for dawdling, and no escape from the guilt that accompanies it. The voice — societal or psychological — urging us away from sloth to the pure, virtuous heights of productivity has become a sort of birdlike shriek as more individuals work from home and set their own schedules, and as the devices we use for work become alluring sirens to our own distraction. We are now able to accomplish tasks at nearly every moment, even if we prefer not to.

Still, humans will never stop procrastinating, and it might do us good to remember that the guilt and shame of the do-it-tomorrow cycle are not necessarily inescapable. The French philosopher Michel Foucault wrote about mental illness that it acquires its reality as an illness “only within a culture that recognizes it as such.” Why not view procrastination not as a defect, an illness or a sin, but as an act of resistance against the strictures of time and productivity imposed by higher powers? To start, we might replace Expeditus with a new saint.

At the conference, I was invited to speak about the Egyptian-born novelist Albert Cossery, a true icon of the right to remain lazy. In the mid-1940s, Cossery wrote a novel in French, “Laziness in the Fertile Valley,” about a family in the Nile Delta that sleeps all day. Their somnolence is a form of protest against a world forever ruled by tyrants winding the clock. Born in 1913 in Cairo, Cossery grew up in a place that still retained cultural memories of the introduction of Western notions of time, a once foreign concept that had arrived along with British military forces in the late 19th century. To turn Egypt into a lucrative colony, the occupiers needed it to run on a synchronized, efficient schedule. The British replaced the Islamic lunar calendar with the Gregorian, preached the values of punctuality, and spread the gospel that time equaled money.

Firm in his belief that time is not as natural or apolitical as we might think, Cossery, in his writings and in his life, strove to reject the very system in which procrastination could have any meaning at all. Until his death in 2008, the elegant novelist, living in Paris, maintained a strict schedule of idleness. He slept late, rising in the afternoons for a walk to the Café de Flore, and wrote fiction only when he felt like it. “So much beauty in the world, so few eyes to see it,” Cossery would say. He was the archetypal flâneur, in the footsteps of Walter Benjamin and Charles Baudelaire, whose verses Cossery would steal for his own poetry when he was a teenager. Rather than charge through the day, storming the gates of tomorrow, his stylized repose was a perch from which to observe, reflect and question whether the world really needs all those things we feel we ought to get done — like a few more pyramids at Giza. And it was idleness that led Cossery to true creativity, dare I say it, in his masterfully unprolific work.

After my talk, someone came up to ask me what I thought was the ideal length of a nap. Saint Cossery was smiling. Already one small battle had been won.

1 in 5 laid-off Americans still can’t find a job

A new report about the lingering effects of the Great Recession finds that about 20 percent of Americans who lost their job during the last five years are still unemployed and looking for work.

Approximately half of the laid-off workers who found work were paid less in their new positions; one in four say their new job was only temporary.

“While the worst effects of the Great Recession are over for most Americans, the brutal realities of diminished living standards endure for the three million American workers who remain jobless years after they were laid off,” says Carl Van Horn, a professor at Rutgers who co-authored the study with Professor Cliff Zukin.

“These long-term unemployed workers have been left behind to fend for themselves as they struggle to pull their lives back together.”

As of last August, 3 million Americans—nearly one in three unemployed workers—have been unemployed for more than six months, and more than 2 million Americans have been out of work for more than a year, the researchers say.

While the percentage of the long-term unemployed (workers who have been unemployed for more than six months) has declined from 46 percent in 2010, it is still above the 26 percent level experienced in the worst previous recession in 1983.

Job training

The national study found that only one in five of the long-term unemployed received help from a government agency when looking for a job; only 22 percent enrolled in a training program to develop skills for a new job; and 60 percent received no government assistance beyond unemployment benefits.

Nearly two-thirds of Americans support increasing funds for long-term education and training programs, and greater spending on roads and highways in order to assist unemployed workers.

For the survey, the Heldrich Center interviewed a representative sample of 1,153 Americans, including 394 unemployed workers looking for work, 389 Americans who have been unemployed for more than six months or who were unemployed for a period of more than six months at some point in the last five years, and 463 individuals who currently have jobs.

Other findings

  • More than seven in 10 long-term unemployed say they have less in savings and income than they did five years ago.
  • More than eight in 10 of the long-term unemployed rate their personal financial situation negatively as only fair or poor.
  • More than six in 10 unemployed and long-term unemployed say they experienced stress in family relationships and close friendships during their time without a job.
  • Fifty-five percent of the long-term unemployed say they will need to retire later than planned because of the recession, while 5 percent say the weak economy forced them into early retirement.
  • Nearly half of the long-term unemployed say it will take three to 10 years for their families to recover financially. Another one in five say it will take longer than that or that they will never recover.

Source: Rutgers

http://www.futurity.org/unemployed-americans-great-recession-772342/

Why Facebook, Google, and the NSA Want Computers That Learn Like Humans

Deep learning could transform artificial intelligence. It could also get pretty creepy.

Illustration: Quickhoney

In June 2012, a Google supercomputer made an artificial-intelligence breakthrough: It learned that the internet loves cats. But here’s the remarkable part: It had never been told what a cat looks like. Researchers working on the Google Brain project in the company’s X lab fed 10 million random, unlabeled images from YouTube into their massive network and instructed it to recognize the basic elements of a picture and how they fit together. Left to its own devices, the Brain’s 16,000 central processing units noticed that many of the images shared similar characteristics, which the network eventually recognized as a “cat.” While the Brain’s self-taught knack for kitty spotting was nowhere near as good as a human’s, it was nonetheless a major advance in the exploding field of deep learning.

The dream of a machine that can think and learn like a person has long been the holy grail of computer scientists, sci-fi fans, and futurists alike. Deep learning—algorithms inspired by the human brain and its ability to soak up massive amounts of information and make complex predictions—might be the closest thing yet. Right now, the technology is in its infancy: Much like a baby, the Google Brain taught itself how to recognize cats, but it’s got a long way to go before it can figure out that you’re sad because your tabby died. But it’s just a matter of time. Its potential to revolutionize everything from social networking to surveillance has sent tech companies and defense and intelligence agencies on a deep-learning spending spree.
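To make the idea of learning from unlabeled data more concrete, here is a toy sketch (in Python with NumPy) of the general technique the cat experiment relied on: a small autoencoder that is never told what its inputs contain, yet learns hidden features capturing whatever structure the examples share. The synthetic data, layer sizes, and training details below are invented for illustration; this is a minimal stand-in for the concept, not the Google Brain system.

    # Toy unsupervised feature learning: a one-hidden-layer autoencoder.
    # No labels are supplied; the network is only asked to reconstruct its
    # input, and the hidden layer ends up encoding the shared structure.
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic "images": 8x8 noisy patches that all contain a bright blob,
    # standing in for the shared structure (e.g., cat faces) in real photos.
    def make_patch():
        patch = rng.normal(0.0, 0.1, (8, 8))
        r, c = rng.integers(2, 6, size=2)
        patch[r-1:r+2, c-1:c+2] += 1.0          # the shared "blob" feature
        return patch.ravel()

    data = np.stack([make_patch() for _ in range(2000)])  # unlabeled dataset

    n_in, n_hidden = 64, 16
    W1 = rng.normal(0, 0.1, (n_in, n_hidden))   # encoder weights
    W2 = rng.normal(0, 0.1, (n_hidden, n_in))   # decoder weights
    lr = 0.01

    for epoch in range(50):
        hidden = np.tanh(data @ W1)             # encode each patch
        recon = hidden @ W2                     # try to reconstruct it
        err = recon - data                      # reconstruction error

        # Backpropagate the mean-squared reconstruction loss.
        grad_W2 = hidden.T @ err / len(data)
        grad_hidden = (err @ W2.T) * (1 - hidden ** 2)
        grad_W1 = data.T @ grad_hidden / len(data)
        W1 -= lr * grad_W1
        W2 -= lr * grad_W2

    print("final reconstruction error:", float(np.mean(err ** 2)))
    # After training, the columns of W1 act as detectors for the blob --
    # features the network discovered on its own, with no labels supplied.

Deep-learning systems like the one described above stack many such layers and train on millions of real images, but the core trick is the same: find representations that explain the data without anyone saying what the data is.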

What really puts deep learning on the cutting edge of artificial intelligence (AI) is that its algorithms can analyze things like human behavior and then make sophisticated predictions. What if a social-networking site could figure out what you’re wearing from your photos and then suggest a new dress? What if your insurance company could diagnose you as diabetic without consulting your doctor? What if a security camera could tell if the person next to you on the subway is carrying a bomb?

And unlike older data-crunching models, deep learning doesn’t slow down as you cram in more info. Just the opposite—it gets even smarter. “Deep learning works better and better as you feed it more data,” explains Andrew Ng, who oversaw the cat experiment as the founder of Google’s deep-learning team. (Ng has since joined the Chinese tech giant Baidu as the head of its Silicon Valley AI team.)

And so the race to build a better virtual brain is on. Microsoft plans to challenge the Google Brain with its own system called Adam. Wired reported that Apple is applying deep learning to build a “neural-net-boosted Siri.” Netflix hopes the technology will improve its movie recommendations. Google, Yahoo, Twitter, and Pinterest have snapped up deep-learning companies; Google has used the technology to read every house number in France in less than an hour. “There’s a big rush because we think there’s going to be a bit of a quantum leap,” says Yann LeCun, a deep-learning pioneer and the head of Facebook’s new AI lab.

Last December, Facebook CEO Mark Zuckerberg appeared, bodyguards in tow, at the Neural Information Processing Systems conference in Lake Tahoe, where insiders discussed how to make computers learn like humans. He has said that his company seeks to “use new approaches in AI to help make sense of all the content that people share.” Facebook researchers have used deep learning to identify individual faces from a giant database called “Labeled Faces in the Wild” with more than 97 percent accuracy. Another project, dubbed PANDA (Pose Aligned Networks for Deep Attribute Modeling), can accurately discern gender, hairstyles, clothing styles, and facial expressions from photos. LeCun says that these types of tools could improve the site’s ability to tag photos, target ads, and determine how people will react to content.

Yet considering recent news that Facebook secretly studied 700,000 users’ emotions by tweaking their feeds or that the National Security Agency harvests 55,000 facial images a day, it’s not hard to imagine how these attempts to better “know” you might veer into creepier territory.

Not surprisingly, deep learning’s potential for analyzing human faces, emotions, and behavior has attracted the attention of national-security types. The Defense Advanced Research Projects Agency has worked with researchers at New York University on a deep-learning program that sought, according to a spokesman, “to distinguish human forms from other objects in battlefield or other military environments.”

Chris Bregler, an NYU computer science professor, is working with the Defense Department to enable surveillance cameras to detect suspicious activity from body language, gestures, and even cultural cues. (Bregler, who grew up near Heidelberg, compares it to his ability to spot German tourists in Manhattan.) His prototype can also determine whether someone is carrying a concealed weapon; in theory, it could analyze a woman’s gait to reveal she is hiding explosives by pretending to be pregnant. He’s also working on an unnamed project funded by “an intelligence agency”—he’s not permitted to say more than that.

And the NSA is sponsoring deep-learning research on language recognition at Johns Hopkins University. Asked whether the agency seeks to use deep learning to track or identify humans, spokeswoman Vanee’ Vines only says that the agency “has a broad interest in deriving knowledge from data.”

Mark Zuckerberg has said that Facebook seeks to “use new approaches in AI to help make sense of all the content that people share.” AP Photo/Ben Margot

Deep learning also has the potential to revolutionize Big Data-driven industries like banking and insurance. Graham Taylor, an assistant professor at the University of Guelph in Ontario, has applied deep-learning models to look beyond credit scores to determine customers’ future value to companies. He acknowledges that these types of applications could upend the way businesses treat their customers: “What if a restaurant was able to predict the amount of your bill, or the probability of you ever returning? What if that affected your wait time? I think there will be many surprises as predictive models become more pervasive.”

Privacy experts worry that deep learning could also be used in industries like banking and insurance to discriminate or effectively redline consumers for certain behaviors. Sergey Feldman, a consultant and data scientist with the brand personalization company RichRelevance, imagines a “deep-learning nightmare scenario” in which insurance companies buy your personal information from data brokers and then infer with near-total accuracy that, say, you’re an overweight smoker in the early stages of heart disease. Your monthly premium might suddenly double, and you wouldn’t know why. This would be illegal, but, Feldman says, “don’t expect Congress to protect you against all possible data invasions.”

And what if the computer is wrong? If a deep-learning program predicts that you’re a fraud risk and blacklists you, “there’s no way to contest that determination,” says Chris Calabrese, legislative counsel for privacy issues at the American Civil Liberties Union.

Bregler agrees that there might be privacy issues associated with deep learning, but notes that he tries to mitigate those concerns by consulting with a privacy advocate. Google has reportedly established an ethics committee to address AI issues; a spokesman says its deep-learning research is not primarily about analyzing personal or user-specific data—for now. While LeCun says that Facebook eventually could analyze users’ data to inform targeted advertising, he insists the company won’t share personally identifiable data with advertisers.

“The problem of privacy invasion through computers did not suddenly appear because of AI or deep learning. It’s been around for a long time,” LeCun says. “Deep learning doesn’t change the equation in that sense, it just makes it more immediate.” Big companies like Facebook “thrive on the trust users have in them,” so consumers shouldn’t worry about their personal data being fed into virtual brains. Yet, as he notes, “in the wrong hands, deep learning is just like any new technology.”

Deep learning, which also has been used to model everything from drug side effects to energy demand, could “make our lives much easier,” says Yoshua Bengio, head of the Machine Learning Laboratory at the University of Montreal. For now, it’s still relatively difficult for companies and governments to efficiently sift through all our emails, texts, and photos. But deep learning, he warns, “gives a lot of power to these organizations.”

Internet Trolls Are Narcissists, Psychopaths, and Sadists

A new study shows that internet trolls really are just terrible human beings.

In this month’s issue of Personality and Individual Differences, a study was published that confirms what we all suspected: internet trolls are horrible people. Let’s start by getting our definitions straight. An internet troll is someone who comes into a discussion and posts comments designed to upset or disrupt the conversation. Often, it seems like there is no real purpose behind their comments except to upset everyone else involved. Trolls will lie, exaggerate, and offend to get a response.

What kind of person would do this?

Canadian researchers decided to find out. They conducted two internet studies with over 1,200 people. They gave personality tests to each subject along with a survey about their internet commenting behavior. They were looking for evidence that linked trolling with the Dark Tetrad of personality: narcissism, Machiavellianism, psychopathy, and sadistic personality.

[Edit to add: these are technical terms with formalized surveys to measure them. You can find lots more information about their formal definitions online]

They found that Dark Tetrad scores were highest among people who said trolling was their favorite internet activity. To get an idea of how much more prevalent these traits were among internet trolls, check out this figure from the paper:

[Figure from the paper: mean Dark Tetrad scores by respondents’ favorite online activity, with those who chose trolling scoring markedly higher on all four traits.]

Look at how low the scores are for everyone except the internet trolls! Their scores for all four terrible personality traits soar on the chart. The relationship between this Dark Tetrad and trolling is so significant that the authors write the following in their paper:

“… the associations between sadism and GAIT (Global Assessment of Internet Trolling) scores were so strong that it might be said that online trolls are prototypical everyday sadists.” [emphasis added]

Trolls truly enjoy making you feel bad. To quote the authors once more (because this is a truly quotable article):

“Both trolls and sadists feel sadistic glee at the distress of others. Sadists just want to have fun … and the Internet is their playground!”

So the next time you encounter a troll online, remember two things: (1) these trolls are some truly messed-up people, and (2) it is your suffering that brings them pleasure, so the best thing you can do is ignore them.

References

Buckels, Erin E., Paul D. Trapnell, and Delroy L. Paulhus. “Trolls just want to have fun.” Personality and Individual Differences 67 (2014): 97–102.

Photo adapted from original by Kevin Dooley

http://www.psychologytoday.com/blog/your-online-secrets/201409/internet-trolls-are-narcissists-psychopaths-and-sadists

Why do so many poor people eat junk food, fail to budget properly, show no ambition?

‘Poor people don’t plan long-term. We’ll just get our hearts broken’

Linda Tirado knew exactly why… because she was one of them. Here, in an extract from her book, Hand to Mouth, she tells her story in her own words

Linda Tirado photographed by Scott Suchman near her home in Washington DC for the Observer New Review.

In the autumn of 2013 I was in my first term of school in a decade. I had two jobs; my husband, Tom, was working full-time; and we were raising our two small girls. It was the first time in years that we felt like maybe things were looking like they’d be OK for a while.

After a gruelling shift at work, I was unwinding online when I saw a question from someone on a forum I frequented: Why do poor people do things that seem so self-destructive? I thought I could at least explain what I’d seen and how I’d reacted to the pressures of being poor. I wrote my answer to the question, hit post, and didn’t think more about it for at least a few days. This is what it said:

Why I make terrible decisions, or, poverty thoughts

There’s no way to structure this coherently. They are random observations that might help explain the mental processes. But often, I think that we look at the academic problems of poverty and have no idea of the why. We know the what and the how, and we can see systemic problems, but it’s rare to have a poor person actually explain it on their own behalf. So this is me doing that, sort of.

Rest is a luxury for the rich. I get up at 6am, go to school (I have a full course load, but I only have to go to two in-person classes), then work, then I get the kids, then pick up my husband, then have half an hour to change and go to Job 2. I get home from that at around 12.30am, then I have the rest of my classes and work to tend to. I’m in bed by 3am. This isn’t every day, I have two days off a week from each of my obligations. I use that time to clean the house and soothe Mr Martini [her partner], see the kids for longer than an hour and catch up on schoolwork.

Those nights I’m in bed by midnight, but if I go to bed too early I won’t be able to stay up the other nights because I’ll fuck my pattern up, and I drive an hour home from Job 2 so I can’t afford to be sleepy. I never get a day off from work unless I am fairly sick. It doesn’t leave you much room to think about what you are doing, only to attend to the next thing and the next. Planning isn’t in the mix.

When I was pregnant the first time, I was living in a weekly motel for some time. I had a mini-fridge with no freezer and a microwave. I was on WIC [government-funded nutritional aid for women, infants and children]. I ate peanut butter from the jar and frozen burritos because they were 12 for $2. Had I had a stove, I couldn’t have made beef burritos that cheaply. And I needed the meat, I was pregnant. I might not have had any prenatal care, but I am intelligent enough to eat protein and iron while knocked up.

I know how to cook. I had to take Home Ec to graduate from high school. Most people on my level didn’t. Broccoli is intimidating. You have to have a working stove, and pots, and spices, and you’ll have to do the dishes no matter how tired you are or they’ll attract bugs. It is a huge new skill for a lot of people. That’s not great, but it’s true. If you fuck it up, you could make your family sick.

We have learned not to try too hard to be middle class. It never works out well and always makes you feel worse for having tried and failed yet again. Better not to try. It makes more sense to get food that you know will be palatable and cheap and that keeps well. Junk food is a pleasure that we are allowed to have; why would we give that up? We have very few of them.

The closest Planned Parenthood [family planning clinic] to me is three hours. That’s a lot of money in gas. Lots of women can’t afford that, and even if you live near one you probably don’t want to be seen coming in and out in a lot of areas. We’re aware that we are not “having kids”, we’re “breeding”. We have kids for much the same reasons that I imagine rich people do. Urge to propagate and all. Nobody likes poor people procreating, but they judge abortion even harder.

Convenience food is just that. And we are not allowed many conveniences. Especially since the Patriot Act [aimed at strengthening domestic security in the war against terrorism] was passed, it’s hard to get a bank account. But without one, you spend a lot of time figuring out where to cash a cheque and get money orders to pay bills. Most motels now have a no-credit-card-no-room policy. I wandered around San Francisco for five hours in the rain once with nearly a thousand dollars on me and could not rent a room even if I gave them a $500 cash deposit and surrendered my cellphone to the desk to hold as surety.

Nobody gives enough thought to depression. You have to understand that we know that we will never not feel tired. We will never feel hopeful. We will never get a vacation. Ever. We know that the very act of being poor guarantees that we will never not be poor. It doesn’t give us much reason to improve ourselves. We don’t apply for jobs because we know we can’t afford to look nice enough to hold them. I would make a super legal secretary but I’ve been turned down more than once because I “don’t fit the image of the firm”, which is a nice way of saying “gtfo, pov”. I am good enough to cook the food, hidden away in the kitchen, but my boss won’t make me a server because I don’t “fit the corporate image”. I am not beautiful. I have missing teeth and skin that looks like it will when you live on B12 and coffee and nicotine and no sleep. Beauty is a thing you get when you can afford it, and that’s how you get the job that you need in order to be beautiful. There isn’t much point trying.

Patients without medical insurance flock to a free dentistry event in Los Angeles. Photograph: Robyn Beck/AFP/Getty Images

CONTINUED: http://www.theguardian.com/society/2014/sep/21/linda-tirado-poverty-hand-to-mouth-extract

Professors on food stamps

The shocking true story of academia in 2014

Forget minimum wage, some adjunct professors say they’re making 50 cents an hour. Wait till you read these stories

(Credit: domin_domin via iStock/Roobcio via Shutterstock/Salon)

You’ve probably heard the old stereotypes about professors in their ivory tower lecturing about Kafka while clad in a tweed jacket. But for many professors today, the reality is quite different: they are so poorly paid and treated that they’re more likely to be found bargain-hunting at day-old bread stores. This is academia in 2014.

“The most shocking thing is that many of us don’t even earn the federal minimum wage,” said Miranda Merklein, an adjunct professor from Santa Fe who started teaching in 2008. “Our students didn’t know that professors with PhDs aren’t even earning as much as an entry-level fast food worker. We’re not calling for the $15 minimum wage. We don’t even make minimum wage. And we have no benefits and no job security.”

Over three-quarters of college professors are adjuncts. Legally, adjunct positions are part-time, at-will employment. Universities pay adjunct professors by the course, anywhere from $1,000 to $5,000. So if a professor teaches three courses in both the fall and spring semesters at a rate of $3,000 per course, they’ll make $18,000. The average full-time barista makes the same yearly wage. However, a full-time adjunct works more than 40 hours a week and isn’t paid for most of those hours.

“If it’s a three credit course, you’re paid for your time in the classroom only,” said Merklein. “So everything else you do is by donation. If you hold office hours, those you’re doing for free. Your grading you do for free. … Anything we do with the student where we sit down and explain what happened when the student was absent, that’s also free labor. Some would call it wage theft because these are things we have to do in order to keep our jobs. We have to do things we’re not getting paid for. It’s not optional.”

Merklein was far from the only professor with this problem.



“It can be a tremendous amount of work,” said Alex Kudera. Kudera started teaching in 1996 and is the author of a novel about adjunct professorship, “Fight For Your Long Day.” “When I was an adjunct, I didn’t have a social life. It’s basically just work all the time. You plan your weekend around the fact that you’re going to be doing work Saturday and Sunday — typically grading papers, which is emotionally exhausting. The grading can be tedious but at least it’s a private thing. It’s basically 5-10 hours a day for every day of the week.”

One professor from Indiana who spoke to Salon preferred to remain anonymous. “At some point early in my adjunct career, I broke down my pay hourly. I figured out that I was making under minimum wage and then I stopped thinking about it,” he said. “I can’t speak for everyone, but I essentially design my own courses. And sometimes I don’t find out how many courses I’m going to be teaching until maybe Thursday and they start Monday. … So I have to develop a course, and it’s been the case where one summer I taught English 102 where the course was literally dropped in my lap three days before it started and I had to develop it entirely from scratch. It didn’t even have a text book. That was three 16-hour days in a row developing a syllabus. … You’re expected to be in contact with students constantly. You have to be available to them all the time. You’re expected to respond to emails generally within 24 hours. I’m always on-call. And it’s one of my favorite parts of my job, I don’t regret it, but if you factored those on-call hours in, that’d be the end of it. I’d be making 50 cents an hour.”

Being financially secure and teaching at an institution of higher education are almost mutually exclusive, even among professors who are able to teach the maximum number of courses each semester. Thus, more than half of adjunct professors in the United States seek a second job. Not all professors can find additional employment. An advanced degree slams most doors shut and opens a handful by the narrowest crack.

Nathaniel Oliver taught as an adjunct for four years in Alabama. He received $12,000 a year during his time teaching.

“You fall in this trap where you may be working for less than you would be at a place that pays minimum wage yet you can’t get the minimum wage jobs because of your education,” Oliver said.

Academia’s tower might be ivory but it casts an obsidian shadow. Oliver was one of many professors trapped in the oxymoronic life of pedantic destitution. Some professors in his situation became homeless. Oliver was “fortunate” enough to only require food stamps, a fact of life for many adjuncts.

“It’s completely insane,” he said. “And this isn’t happening just to me. More and more people are doing it.”

“We have food stamps,” said the anonymous adjunct from Indiana. “We wouldn’t be able to survive without them.”

“Many professors are on food stamps and they go to food donation centers. They donate plasma. And that’s a pretty regular occurrence,” Merklein told Salon.

Life isn’t much easier for those lucky enough to find another income stream. Many are reduced to menial service jobs and other forms of first-world deprivation.

“I ended up applying for a job in a donut shop recently,” said an Ohio professor who requested to go by a pseudonym. Professor Doe taught for over two decades. In many years he made only $9,600. Resorting to a food service job was the only way he could afford to live, but it came with more than its expected share of humiliation.

“One of the managers there is one of the students I had a year ago who was one of the very worst writers I’ve ever had. What are we really saying here? What’s going on in the work world? Something does not seem quite right. I’m not asking to be rich. I’m not asking to be famous. I just want to pay my bills.”

Life became even more harrowing for adjuncts after the Affordable Care Act when universities slashed hours and health insurance coverage became even more difficult to obtain.

“They’re no better off than people who work at Walmart,” said Gordon Haber, a 15-year adjunct professor and author of “Adjunctivitis.”

Perhaps not surprisingly, other professors echoed this sentiment.

“There’s this idea that faculty are cheap, renewable labor. There’s the idea that students are customers or clients,” said Joseph Fruscione, a former adjunct of 15 years. “And there are some cases where if a student is displeased with a grade, there’s the notion where they’re paying for this, so they deserve an A or a B because of all this tuition.”

“The Walmart metaphor is vivid,” Kudera said. “There are these random schools where they’re just being terrible. But at some of the schools it seems like there’s some enlightened schools and it doesn’t seem like every single person who speaks up loses their classes. It varies school to school. They’re well aware some of their adjuncts may not be able to afford toothpaste at the end of the month or whatever those kinds of tragedies may be.” He suggested looking at the hashtag #badmin to see transgressions and complaints documented in real time.

Robert Baum, a former adjunct and now a dean, was able to provide insights from both sides of the problem.

“That pressure [to make money] has been on higher education forever,” he said. “A lot of the time when I was an adjunct, things were very black and what I’m finding is that the graying is happening a lot. I’m losing track of the black and white.” Still, Baum noted that the current system was hardly ideal, and that change was necessary. “The Walmart model is based on the idea of putting the burden of taking care of the worker on either the state or on the worker’s credit card or on the worker’s family. And that is no different than what I’ve experienced across my adjunct life. No different. Zero difference.”

Ana Fores Tamayo, an adjunct who claims she was blacklisted over her activism, agreed with the latter parts of Baum’s assessment.

“Walmart and the compartmentalized way of treating faculty is the going rate. The way administration turns around and says, for instance, where I was teaching it was probably about 65% adjunct faculty. But the way they fix their numbers, it makes it look as if it’s less when they show their books, because of the way they divide it and the way they play with their numbers it shows that it’s less.”

“As soon as they hear about you organizing, they go on the defensive,” Merklein said. “For instance, at my community college, I am being intimidated constantly and threatened in various ways, hypothetically usually. They don’t like to say something that’s an outright direct threat. … They get really freaked out when they see pamphlets around the adjunct faculty office and everyone’s wearing buttons regardless of what professional organization or union it is. They will then go on the offensive. They will usually contact their attorney who is there to protect the school as a business and to act in an anti-labor capacity.”

The most telling phrase in Merklein’s words is “the school as a business.” Colleges across the country have transitioned from bastions of intellectual enlightenment to resort hotels prizing amenities above academics. Case in point: The ludicrously extravagant gyms in America’s larger universities are home to rock climbing walls, corkscrew tracks, rooftop gardens, and a lazy river. Schools have billions to invest in housing and other on-campus projects. Schools have millions (or in some cases “mere” hundreds of thousands) to pay administrators. Yet schools can’t find the money to hire more full-time professors. If one follows the money, it’s clear that colleges view education as tertiary. The rigor of a university’s courses doesn’t attract the awe of doe-eyed high school seniors. Lavish dorms and other luxuries do.

Despite such execrable circumstances, professors trek onward and try to educate students as best they can. But how good can education provided by overworked, underpaid adjuncts be? The professors Salon spoke to had varying opinions.

Benay Blend has taught for over 30 years. For 10 of those years, she worked in a bookstore for $7.50 an hour because she needed the extra income.

“I don’t want to fall into the trap that the media use that using adjunct labor means poor education,” Blend said. “I have a PhD. I’ve published probably more than full-time people where I teach. I’ve been teaching for 30 years. I’m a good teacher.”

“On the whole, teaching quality by adjuncts is excellent,” said Kane Faucher, a six-year adjunct. “But many are not available for mentoring and consultation because they have to string together so many courses just to reach or possibly exceed the poverty line. This means our resources are stretched too thinly as a matter of financial survival, and there are many adjuncts who do not even have access to a proper office, which means they work out of coffee shops and cars.”

The anonymous adjunct professor from Indiana expressed a similar sentiment.

“I definitely don’t want to go down the road of ‘Adjunct professors, because of the way we’re handled, are not able to be effective teachers.’ I think some of us are more effective teachers than people who get paid a lot more than we do. Some of us aren’t for really good reasons which have to do with not having the resources. I mean if you’re working at three different colleges, how can you possibly be there?”

Ann Kottner, an adjunct professor and activist, agreed.

“The real problem with the adjunct market right now is that it cheats students of the really outstanding educations they should be getting,” she said. “They’re paying a lot of money for these educations and they’re not getting them. And it’s not because they have bad instructors, it’s because their instructors are not supported to do the kind of work they can do.”

The situation reached such a flashpoint that Kottner and several colleagues (some of whom spoke to Salon for this article) penned a petition to the US Department of Labor’s Wage and Hour Division. The petition calls for “an investigation into the labor practices of our colleges and universities in the employment of contingent faculty.” Ana Fores Tamayo has a petition as well, this one to the US Secretary of Education, Arne Duncan. They both have over 8,000 signatories.

When asked about the petition’s impact, Kottner said it was “just one tactic in the whole sheath of a rising adjunct response to contingency.” Other tools included unionization, which is difficult in many states. Kottner said the most powerful force was information. “I think our biggest weapon now is basically making the public aware of what their tuition dollars are not paying for, and that is professor salaries and professor security.”

When asked if there was any hope about the future, no consensus was reached among the adjuncts Salon spoke with. Some believed things would never change. Others thought the tide would turn if enough people knew how far the professoriat had fallen.

http://www.salon.com/2014/09/21/professors_on_food_stamps_the_shocking_true_story_of_academia_in_2014/