How to Stop Time

Credit Viktor Hachmang

In the unlikely event that we could ever unite under the banner of a single saint, it might just be St. Expeditus. According to legend, when the Roman centurion decided to convert to Christianity, the Devil appeared in the form of a crow and circled above him crying “cras, cras” — Latin for “tomorrow, tomorrow.” Expeditus stomped on the bird and shouted victoriously, “Today!” For doing so, Expeditus achieved salvation, and is worshiped as the patron saint of procrastinators. Sometimes you see icons of him turned upside down like an hourglass in the hope that he’ll hurry up and help you get your work done so he can be set right-side up again. From job-seekers in Brazil to people who run e-commerce sites in New Orleans, Expeditus is adored not just for his expediency, but also for his power to settle financial affairs. There is even a novena to the saint on Facebook.

Expeditus was martyred in A.D. 303, but was resurrected around the time of the Industrial Revolution, as the tempo of the world accelerated with breathtaking speed. Sound familiar? Today, as the pace of our lives quickens and the demands placed on us multiply, procrastination is the archdemon many of us wrestle with daily. It would seem we need Expeditus more than ever.


“Procrastination, quite frankly, is an epidemic,” declares Jeffery Combs, the author of “The Procrastination Cure,” just one in a vast industry of self-help books selling ways to crush the beast. The American Psychological Association estimates that 20 percent of American men and women are “chronic procrastinators.” Figures place the amount of money lost in the United States to procrastinating employees at trillions of dollars a year.

A recent infographic in The Economist revealed that in the 140 million hours humanity spent watching “Gangnam Style” on YouTube two billion times, we could have built at least four more (desperately needed) pyramids at Giza. Endless articles pose the question of why we procrastinate, what’s going wrong in the brain, how to overcome it, and the fascinating irrationality of it all.

But if procrastination is so clearly a society-wide, public condition, why is it always framed as an individual, personal deficiency? Why do we assume our own temperaments and habits are at fault — and feel bad about them — rather than question our culture’s canonization of productivity?

I was faced with these questions at an unlikely event this past July — an academic conference on procrastination at the University of Oxford. It brought together a bright and incongruous crowd: an economist, a poetry professor, a “biographer of clutter,” a queer theorist, a connoisseur of Iraqi coffee-shop culture. There was the doctoral student who spoke on the British painter Keith Vaughan, known to procrastinate through increasingly complicated experiments in auto-erotica. There was the children’s author who tied herself to her desk with her shoelaces.

The keynote speaker, Tracey Potts, brought a tin of sugar cookies she had baked in the shape of the notorious loiterer Walter Benjamin. The German philosopher famously procrastinated on his “Arcades Project,” a colossal meditation on the cityscape of Paris where the figure of the flâneur — the procrastinator par excellence — would wander. Benjamin himself fatally dallied in escaping the city ahead of the Nazis. He took his own life, leaving the manuscript forever unfinished, more evidence, it would seem, that no avoidable delay goes unpunished.

As we entered the ninth, grueling hour of the conference, a professor laid out a taxonomy of dithering so enormous that I couldn’t help but wonder: Whatever you’re doing, aren’t you by nature procrastinating from doing something else? Seen in this light, procrastination begins to look a lot like just plain existing. But then along come its foot soldiers — guilt, self-loathing, blame.

Dr. Potts explained how procrastination entered the field as pathological behavior in the mid-20th century. Drawing on the work of the British-born historian Christopher Lane, Dr. Potts directed our attention to a United States War Department bulletin issued in 1945 that chastised soldiers who were avoiding their military duties “by passive measures, such as pouting, stubbornness, procrastination, inefficiency and passive obstructionism.” In 1952, when the American Psychiatric Association assembled the first edition of the Diagnostic and Statistical Manual of Mental Disorders — the bible of mental health used to determine illness to this day — it copied the passage from the cranky military memo verbatim.

And so, procrastination became enshrined as a symptom of mental illness. By the mid-60s, passive-aggressive personality disorder had become a fairly common diagnosis and “procrastination” remained listed as a symptom in several subsequent editions. “Dawdling” was added to the list, after years of delay.

While passive-aggressive personality disorder has been erased from the official portion of the manual, the stigma of slothfulness remains. Many of us, it seems, are still trying to enforce a military-style precision on our intellectual, creative, civilian lives — and often failing. Even at the conference, participants proposed strategies for beating procrastination that were chillingly martial. The economist suggested that we all “take hostages” — place something valuable at stake as a way of negotiating with our own belligerent minds. The children’s author writes large checks out to political parties she loathes, and entrusts them to a relative to mail if she misses a deadline.

All of which leads me to wonder: Are we imposing standards on ourselves that make us mad?

Though Expeditus’s pesky crow may be ageless, procrastination as epidemic — and the constant guilt that goes with it — is peculiar to the modern era. The 21st-century capitalist world, in its never-ending drive for expansion, consecrates an always-on productivity for the sake of the greater fiscal health.

In an 1853 short story, Herman Melville gave us Bartleby, the obstinate scrivener and apex procrastinator, who confounds the requests of his boss with his hallowed mantra, “I would prefer not to.” A perfect employee on the surface — he never leaves the office and sleeps at his desk — Bartleby represents a total rebellion against the expectations placed on him by society. Politely refusing to accept money or to remove himself from his office even after he is fired, the copyist went on to have an unexpected afterlife — as a hero for the Occupy movement in 2012. “Bartleby was the first laid-off worker to occupy Wall Street,” Jonathan D. Greenberg noted in The Atlantic. Confronted with Bartleby’s serenity and his utter noncompliance with the status quo, his perplexed boss is left wondering whether he himself is the one who is mad.

A month before the procrastination conference, I set myself the task of reading “Oblomov,” the 19th-century Russian novel by Ivan Goncharov about the ultimate slouch, who, over the course of 500 pages, barely moves from his bed, and then only to shift to the sofa. At least that’s what I heard: I failed to make it through more than two pages at a sitting without putting the novel down and allowing myself to drift off. I would carry the heavy book everywhere with me — it was like an anchor into a deep, blissful sea of sleep.

Oblomov could conduct the few tasks he cared to from under his quilt — writing letters, accepting visitors — but what if he’d had an iPhone and a laptop? Being in bed is now no excuse for dawdling, and no escape from the guilt that accompanies it. The voice — societal or psychological — urging us away from sloth to the pure, virtuous heights of productivity has become a sort of birdlike shriek as more individuals work from home and set their own schedules, and as the devices we use for work become alluring sirens to our own distraction. We are now able to accomplish tasks at nearly every moment, even if we prefer not to.

Still, humans will never stop procrastinating, and it might do us good to remember that the guilt and shame of the do-it-tomorrow cycle are not necessarily inescapable. The French philosopher Michel Foucault wrote about mental illness that it acquires its reality as an illness “only within a culture that recognizes it as such.” Why not view procrastination not as a defect, an illness or a sin, but as an act of resistance against the strictures of time and productivity imposed by higher powers? To start, we might replace Expeditus with a new saint.

At the conference, I was invited to speak about the Egyptian-born novelist Albert Cossery, a true icon of the right to remain lazy. In the mid-1940s, Cossery wrote a novel in French, “Laziness in the Fertile Valley,” about a family in the Nile Delta that sleeps all day. Their somnolence is a form of protest against a world forever ruled by tyrants winding the clock. Born in 1913 in Cairo, Cossery grew up in a place that still retained cultural memories of the introduction of Western notions of time, a once-foreign concept that had arrived along with British military forces in the late 19th century. To turn Egypt into a lucrative colony, the British needed it to run on a synchronized, efficient schedule. They replaced the Islamic lunar calendar with the Gregorian, preached the values of punctuality, and spread the gospel that time equaled money.

Firm in his belief that time is not as natural or apolitical as we might think, Cossery, in his writings and in his life, strove to reject the very system in which procrastination could have any meaning at all. Until his death in 2008, the elegant novelist, living in Paris, maintained a strict schedule of idleness. He slept late, rising in the afternoons for a walk to the Café de Flore, and wrote fiction only when he felt like it. “So much beauty in the world, so few eyes to see it,” Cossery would say. He was the archetypal flâneur, in the footsteps of Walter Benjamin and Charles Baudelaire, whose verses Cossery would steal for his own poetry when he was a teenager. Rather than charge through the day, storming the gates of tomorrow, his stylized repose was a perch from which to observe, reflect and question whether the world really needs all those things we feel we ought to get done — like a few more pyramids at Giza. And it was idleness that led Cossery to true creativity, dare I say it, in his masterfully unprolific work.

After my talk, someone came up to ask me what I thought was the ideal length of a nap. Saint Cossery was smiling. Already one small battle had been won.

Rotten to the Core: How an Apple mega-deal cost Los Angeles classrooms $1 billion

Bad business and worse ethics? A scandal is brewing in L.A. over a sketchy initiative to give every student an iPad

Technology companies may soon be getting muddied from a long-running scandal at the Los Angeles Unified School District (LAUSD), the nation’s second-largest system. A year after the cash-strapped district signed a $1 billion contract with Apple to purchase iPads for every student, the once-ballyhooed deal has blown up. Now the mess threatens to sully other vendors from Cambridge to Cupertino.

LAUSD superintendent John Deasy is under fire for his cozy connections to Apple. In an effort to deflect attention and perhaps to show that “everybody else is doing it,” he’s demanded the release of all correspondence between his board members and technology vendors. It promises to be some juicy reading. But at its core, the LAUSD fiasco illustrates just how much gold lies beneath even the dirtiest, most neglected public schoolyard.

As states begin implementing the Common Core State Standards, teachers and administrators are being driven to adopt technology as never before. That has set off a scramble in Silicon Valley to grab as much of the $9 billion K-12 market as possible, and Apple, Google, Cisco and others are mud-wrestling to seize a part of it. Deasy and the LAUSD have given us ringside seats to this match, which shows just how low companies will go.

When the Apple deal was announced a year ago, it was touted as the largest ever distribution of computing devices to American students. The Los Angeles Times ran a story accompanied by a photograph of an African-American girl and her classmate, who looked absolutely giddy about their new gadgets. Readers responded to the photo’s idealistic promise — that every child in Los Angeles, no matter their race or socioeconomic background, would have access to the latest technology, and Deasy himself pledged “to provide youth in poverty with tools that heretofore only rich kids have had.” Laudable as it was, that sentiment assumed that technology would by itself save our underfunded schools and somehow balance our inequitable society.



When I heard about the deal, I felt a wave of déjà vu. I had sat in a PTA meeting at a public school listening to a similar, albeit much smaller, proposed deal.  An Apple vendor had approached administrators in a Santa Barbara County school, offering to sell us iPads. The pitch was that we could help propel our kids into the technological age so that they’d be better prepared for the world, and maybe land a nice-paying, high-tech job somewhere down the line. Clearly, a school contract would be great for Apple, giving it a captive group of impressionable 11-year-olds it could then mold into lifelong customers.

But parents had to raise a lot of money to seal this deal. “Is Apple giving us a discount?” asked a fellow PTA member. No, we were told. Apple doesn’t give discounts, not even to schools. In the end, we decided to raise funds for an athletics program and some art supplies instead.

To be fair, PTA moms and dads are no match at the bargaining table for the salespeople at major companies like Google and Hewlett-Packard. But the LAUSD, with its $6.8 billion budget, had the brains and muscle necessary to negotiate something valuable for its 655,000 students. That was the hope, at least.

Alas, problems began to appear almost immediately. First, some clever LAUSD students hacked the iPads and deleted security filters so they could roam the Internet freely and watch YouTube videos. Then, about $2 million in iPads and other devices went “missing.” Worse was the discovery that the pricey curriculum software, developed by Pearson Education Corp., wasn’t even complete. And the board looked foolish when it had to pay even more money to buy keyboards for iPads so that students could actually type out their reports.

Then, there was the deal itself. Whereas many companies extend discounts to schools and other nonprofits, Apple usually doesn’t, said George Michaels, executive director of Instructional Development at University of California at Santa Barbara. “Whatever discounts Apple gives are pretty meager.” The Chronicle of Philanthropy has noted Apple’s stingy reputation, and CEO Tim Cook has been trying to change the corporation’s miserly ways by giving $50 million to a local hospital and $50 million to an African nonprofit.

But the more we learned about the Apple “deal,” the more the LAUSD board seemed outmaneuvered. The district had bought iPad 4s, which have since been discontinued, but Apple had locked the district into paying high prices for the old models. LAUSD had not checked with its teachers or students to see what they needed or wanted, and instead had forced its end users to make the iPads work. Apple surely knew that kids needed keypads to write reports, but sold them just part of what they needed.

Compared with similar contracts signed by other districts, Apple’s deal for Los Angeles students looked crafty, at best. Perris Union High School District in Riverside County, for example, bought Samsung Chromebooks for only $344 per student. And their laptop devices have keyboards and multiple input ports for printers and thumb drives. The smaller Township High School District 214 in Illinois bought old iPad 2s without the pre-loaded, one-size-fits-all curriculum software. Its price: $429 per student.

But LAUSD paid Apple a jaw-dropping $768 per student, and LAUSD parents were not happy. As Manel Saddique wrote on a social media site: “Btw, thanks for charging a public school district more than the regular consumer price per unit, Apple. Keep it classy…”

By spring there was so much criticism about the purchase that the Los Angeles Times filed a request under the California Public Records Act to obtain all emails and records tied to the contract. What emerged was the image of a smoky backroom deal.

Then-Deputy Superintendent Jaime Aquino had once worked at Pearson, the curriculum developer, and knew the players. It turned out that Aquino and Deasy had started talking with Apple and Pearson two years before the contract was approved, and a full year before it was put out to bid. The idea behind a public bidding process is that every vendor is supposed to have the same opportunity to win a job, depending on their products, delivery terms and price. But emails show that Deasy was intent on embracing just one type of device: Apple’s.

Aquino went so far as to appear in a promotional video for iPads months before the contracts were awarded. Dressed in a suit and tie, the school official smiled for the camera as he talked about how Apple’s product would lead to “huge leaps in what’s possible for students” and would “phenomenally . . . change the landscape of education.” If other companies thought they had a shot at nabbing the massive contract from the influential district, this video must have disabused them of that idea.

At one point, Aquino was actually coaching software developer Pearson on what to do: “[M]ake sure that your bid is the lower one,” he wrote. Meanwhile, Deasy was emailing Pearson CEO Marjorie Scardino, and effusively recounting his visit with Apple’s CEO. “I wanted to let you know I had an excellent meeting with Tim at Apple last Friday … The meeting went very well and he was fully committed to being a partner … He was very excited.”

If you step back from the smarmy exchanges, a bigger picture emerges. Yes, LAUSD is grossly mismanaged and maybe even dysfunctional. But corporations like Apple don’t look so good, either. Google, Microsoft, Facebook, Apple, Hewlett-Packard — the companies that are cashing in on our classroom crisis are the same ones that helped defund the infrastructure that once made public schools so good. Sheltering billions of dollars from federal taxes may be great for the top 10 percent of Americans, who own 90 percent of the stock in these corporations. But it’s a catastrophe for the teachers, schools and universities that helped develop their technology and gave the companies some of their brightest minds. In the case of LAUSD, Apple comes across as cavalier about the problem it’s helped create for low-income students, and seems more concerned with maximizing its take from the district.

But the worst thing about this scandal is what it’s done to the public trust. The funds for this billion-dollar boondoggle were taken from voter-approved school construction and modernization bonds — bonds that voters thought would be used for physical improvements. At a time when LAUSD schools, like so many across the country, are in desperate need of physical repairs, from corroded gas lines to broken play structures, the Apple deal has cast a shadow over school bonds. Read the popular “Repairs Not iPads” page on Facebook and parents’ complaints about the lack of air conditioning, librarians and even toilet paper in school bathrooms. Sadly, replacing old fixtures and cheap trailers with new plumbing and classrooms doesn’t carry the same cachet for ambitious school boards as, say, buying half a million electronic tablets. As one mom wrote: “Deasy has done major long-term damage because not one person will ever vote for any future bond measures supporting public schools.”

Now, the Apple deal is off, although millions of dollars have already been spent. An investigation into the bidding process is underway and there are cries to place Deasy in “teacher jail,” a district policy that keeps teachers at home while they’re under investigation. And LAUSD students, who are overwhelmingly Hispanic and African-American, have once again been given the short end of the stick. They were promised the sort of “tools that heretofore only rich kids have had,” and will probably not see them for several years, if ever. The soured Apple deal just adds to the sense of injustice that many of these students already see in the grown-up world.

Deasy contends that he did nothing wrong. In a few weeks, the public official will get his job performance review. In the meantime, he’s called for the release of all emails and documents written between board members and other Silicon Valley and corporate education vendors. The heat in downtown Los Angeles is spreading to Northern California and beyond, posing a huge political problem for not just Deasy but for Cook and other high-tech captains.

But at the bottom of this rush to place technology in every classroom is the nagging feeling that the goal in buying expensive devices is not to improve teachers’ abilities, or to lighten their load. It’s not to create more meaningful learning experiences for students or to lift them out of poverty or neglect. It’s to facilitate more test-making and profit-taking for private industry, and quick, too, before there’s nothing left.

 

1 in 5 Americans still can’t find a job


A new report about the lingering effects of the Great Recession finds that about 20 percent of Americans who lost their job during the last five years are still unemployed and looking for work.

Approximately half of the laid-off workers who found work were paid less in their new positions; one in four say their new job was only temporary.

“While the worst effects of the Great Recession are over for most Americans, the brutal realities of diminished living standards endure for the three million American workers who remain jobless years after they were laid off,” says Carl Van Horn, a professor at Rutgers who co-authored the study with Professor Cliff Zukin.

“These long-term unemployed workers have been left behind to fend for themselves as they struggle to pull their lives back together.”

As of last August, 3 million Americans—nearly one in three unemployed workers—have been unemployed for more than six months, and more than 2 million Americans have been out of work for more than a year, the researchers say.

While the percentage of the long-term unemployed (workers who have been unemployed for more than six months) has declined from 46 percent in 2010, it is still above the 26 percent level experienced in the worst previous recession in 1983.

Job training

The national study found that only one in five of the long-term unemployed received help from a government agency when looking for a job; only 22 percent enrolled in a training program to develop skills for a new job; and 60 percent received no government assistance beyond unemployment benefits.

Nearly two-thirds of Americans support increasing funds for long-term education and training programs, and greater spending on roads and highways in order to assist unemployed workers.

For the survey, the Heldrich Center interviewed a representative sample of 1,153 Americans, including 394 unemployed workers looking for work, 389 Americans who have been unemployed for more than six months or who were unemployed for a period of more than six months at some point in the last five years, and 463 individuals who currently have jobs.

Other findings

  • More than seven in 10 long-term unemployed say they have less in savings and income than they did five years ago.
  • More than eight in 10 of the long-term unemployed rate their personal financial situation negatively as only fair or poor.
  • More than six in 10 unemployed and long-term unemployed say they experienced stress in family relationships and close friendships during their time without a job.
  • Fifty-five percent of the long-term unemployed say they will need to retire later than planned because of the recession, while 5 percent say the weak economy forced them into early retirement.
  • Nearly half of the long-term unemployed say it will take three to 10 years for their families to recover financially. Another one in five say it will take longer than that or that they will never recover.

Source: Rutgers

http://www.futurity.org/unemployed-americans-great-recession-772342/

How our botched understanding of ‘science’ ruins everything

Intellectuals of all persuasions love to claim the banner of science. A vanishing few do so properly.

Here’s one certain sign that something is very wrong with our collective mind: Everybody uses a word, but no one is clear on what the word actually means.

One of those words is “science.”

Everybody uses it. Science says this, science says that. You must vote for me because science. You must buy this because science. You must hate the folks over there because science.

Look, science is really important. And yet, who among us can easily provide a clear definition of the word “science” that matches the way people employ the term in everyday life?

So let me explain what science actually is. Science is the process through which we derive reliable predictive rules through controlled experimentation. That’s the science that gives us airplanes and flu vaccines and the Internet. But what almost everyone means when he or she says “science” is something different.

To most people, capital-S Science is the pursuit of capital-T Truth. It is a thing engaged in by people wearing lab coats and/or doing fancy math that nobody else understands. The reason capital-S Science gives us airplanes and flu vaccines is not because it is an incremental engineering process but because scientists are really smart people.

In other words — and this is the key thing — when people say “science”, what they really mean is magic or truth.

A little history: The first proto-scientist was the Greek intellectual Aristotle, who wrote many manuals of his observations of the natural world and who also was the first person to propose a systematic epistemology, i.e., a philosophy of what science is and how people should go about it. Aristotle’s definition of science became famous in its Latin translation as: rerum cognoscere causas, or, “knowledge of the ultimate causes of things.” For this, Aristotle is often described in textbooks as the Father of Science.

The problem with that is that it’s absolutely not true. Aristotelian “science” was a major setback for all of human civilization. For Aristotle, science started with empirical investigation and then used theoretical speculation to pronounce on the causes of things.

What we now know as the “scientific revolution” was a repudiation of Aristotle: science, not as knowledge of the ultimate causes of things but as the production of reliable predictive rules through controlled experimentation.

Galileo disproved Aristotle’s “demonstration” that heavier objects should fall faster than light ones by creating a subtle controlled experiment (contrary to legend, he did not simply drop two objects from the Tower of Pisa). What was so important about this Galileo Moment was not that Galileo was right and Aristotle wrong; what was so important was how Galileo proved Aristotle wrong: through experiment.

This method of doing science was then formalized by one of the greatest thinkers in history, Francis Bacon. What distinguishes modern science from other forms of knowledge such as philosophy is that it explicitly forsakes abstract reasoning about the ultimate causes of things and instead tests empirical theories through controlled investigation. Science is not the pursuit of capital-T Truth. It’s a form of engineering — of trial and error. Scientific knowledge is not “true” knowledge, since it is knowledge about only specific empirical propositions — which is always, at least in theory, subject to further disproof by further experiment. Many people are surprised to hear this, but the founder of modern science said as much. Bacon, who had a career in politics and was an experienced manager, actually wrote that scientists would have to be misled into thinking science is a pursuit of the truth, so that they will be dedicated to their work, even though it is not.

Why is all this ancient history important? Because science is important, and if we don’t know what science actually is, we are going to make mistakes.

The vast majority of people, including a great many very educated ones, don’t actually know what science is.

If you ask most people what science is, they will give you an answer that looks a lot like Aristotelian “science” — i.e., the exact opposite of what modern science actually is. Capital-S Science is the pursuit of capital-T Truth. And science is something that cannot possibly be understood by mere mortals. It delivers wonders. It has high priests. It has an ideology that must be obeyed.

This leads us astray. Since most people think math and lab coats equal science, people call economics a science, even though almost nothing in economics is actually derived from controlled experiments. Then people get angry at economists when they don’t predict impending financial crises, as if having tenure at a university endowed you with magical powers. Countless academic disciplines have been wrecked by professors’ urges to look “more scientific” by, like a cargo cult, adopting the externals of Baconian science (math, impenetrable jargon, peer-reviewed journals) without the substance and hoping it will produce better knowledge.

Because people don’t understand that science is built on experimentation, they don’t understand that studies in fields like psychology almost never prove anything, since only replicated experiment proves something and, humans being a very diverse lot, it is very hard to replicate any psychological experiment. This is how you get articles with headlines saying “Study Proves X” one day and “Study Proves the Opposite of X” the next day, each illustrated with stock photography of someone in a lab coat. That gets a lot of people to think that “science” isn’t all that it’s cracked up to be, since so many studies seem to contradict each other.

This is how you get people asserting that “science” commands this or that public policy decision, even though, with very few exceptions, almost none of the policy options available to us as a polity have been tested through experiment (or can be). People think that a study that uses statistical wizardry to show correlations between two things is “scientific” because it uses high school math and was done by someone in a university building, but, correctly speaking, it is not. While it is a fact that increased carbon dioxide in the atmosphere leads, all else equal, to higher atmospheric temperatures, the idea that we can predict the impact of global warming — and anti-global warming policies! — 100 years from now is sheer lunacy. But because it is done using math by people with tenure, we are told it is “science” even though by definition it is impossible to run an experiment on the year 2114.

This is how you get the phenomenon of philistines like Richard Dawkins and Jerry Coyne thinking science has made God irrelevant, even though, by definition, religion concerns the ultimate causes of things and, again, by definition, science cannot tell you about them.


You might think of science advocate, cultural illiterate, mendacious anti-Catholic propagandist, and possible serial fabulist Neil deGrasse Tyson and anti-vaccine loony-tune Jenny McCarthy as polar opposites on a pro-science/anti-science spectrum, but in reality they are two sides of the same coin. Both of them think science is like magic, except one of them is part of the religion and the other isn’t.

The point isn’t just that McCarthy is wrong on vaccines. (She is.) The point is that she is the predictable result of a society that has forgotten what “science” means. Because we lump so many different things together under that word, bits of “science” that aren’t actual science get folded into society’s understanding of what science is. This is very profitable for those who grab some of the social prestige that accrues to science, but it means we live in a state of confusion.

It also means that for all our bleating about “science” we live in an astonishingly unscientific and anti-scientific society. We have plenty of anti-science people, but most of our “pro-science” people are really pro-magic (and therefore anti-science).

This bizarre misunderstanding of science yields the paradox that even as we expect the impossible from science (“Please, Mr Economist, peer into your crystal ball and tell us what will happen if Obama raises/cuts taxes”), we also have a very anti-scientific mindset in many areas.

For example, our approach to education is positively obscurantist. Nobody uses rigorous experimentation to determine better methods of education, and someone who would dare to do so would be laughed out of the room. The first and most momentous scientist of education, Maria Montessori, produced an experimentally based, scientific education method that has been largely ignored by our supposedly science-enamored society. We have departments of education at very prestigious universities, and absolutely no science happens at any of them.

Our approach to public policy is also astonishingly pre-scientific. There have been almost no large-scale, truly scientific experiments on public policy since the welfare randomized field trials of the 1990s, and nobody seems to realize how barbaric this is. As long as we have people at Brookings who can run spreadsheets and an Ezra Klein to write about it and declare it proof, we seem to think we have all the science we need, thank you very much. But that is not science.

Modern science is one of the most important inventions of human civilization. But the reason it took us so long to invent it, and the reason we still haven’t quite understood what it is 500 years later, is that it is very hard to be scientific. Not because science is “expensive,” but because it requires a fundamental epistemic humility, and humility is the hardest thing to wring out of the bombastic animals we are.

But until we take science for what it really is, which is both more and less than magic, we will still have one foot in the barbaric dark.


http://theweek.com/article/index/268360/how-our-botched-understanding-of-science-ruins-everything

The witch-hunting of Steven Salaita and the new McCarthyism


23 September 2014

The political victimization of Steven Salaita, whose appointment as a tenured professor at the University of Illinois Urbana-Champaign (UIUC) was revoked because he tweeted outraged protests against the slaughter of Palestinian civilians in Gaza, is a chilling attack on core democratic rights, including freedom of speech and academic freedom.

The university administration and the University of Illinois Board of Trustees have justified the witch-hunt against Salaita in the name of “democracy,” “civility” and “pluralism.” This not only expresses the hypocrisy that pervades these institutions, it reflects the evisceration of all democratic principles and mechanisms within American capitalist society as a whole.

The termination of Salaita’s appointment as a tenured professor of American Indian Studies at UIUC came after he had given up his position at Virginia Tech and moved with his wife, who also left her job, and young child to Illinois. The pretext for his removal was a series of tweets he had sent in the midst of the one-sided Israeli war on the Palestinian population of Gaza.

A campaign initiated by the political right and the Zionist lobby to twist what Salaita said in these tweets in order to smear him as an anti-Semite was embraced by the university administration and some ostensibly liberal representatives of academia. The most frequently cited tweet, written by Salaita on July 19 as Israel escalated its murderous violence against Gaza, stated, “Zionists: transforming ‘antisemitism’ from something horrible into something honorable since 1948.”

The tweet was ripped out of its context of a series of statements, including the clarifications that it was the Zionists who had distorted the meaning of the term “anti-semitism” by equating it with something as “honorable” as “deplor[ing] colonization, land theft and child murder,” and that this outlook served to “cheapen anti-Semitism by likening it to principled stands against state violence.”

He further wrote: “My stand is fundamentally one of acknowledging and countering the horror of antisemitism.” In other tweets, he added that he supported Gaza because “I believe that Jewish and Arab children are equal in the eyes of God” and that he found himself “in solidarity with many Jews and in disagreement with many Arabs.”

Only a grotesque and willful distortion could attribute to Salaita support for anti-Semitism based on these messages. This, of course, is precisely the specialty of the American right and the Zionist lobby, which proceeded to do just that.

The right-wing web site the Daily Caller gave prominence to these slanders, while the Simon Wiesenthal Center—for which there is no greater crime than demanding equal rights for Palestinians and Israelis—demanded that the university rescind Salaita’s appointment, describing him as “nothing more than a baseless anti-Semite.”

That this noxious political alliance of right-wing ideologues and rabid Zionist anti-Palestinians found powerful backing has been established with the exposure of emails from prominent wealthy donors to the university threatening to withhold funding if the administration failed to carry through the politically motivated victimization of Salaita.

Whatever Salaita tweeted had no bearing on his appointment as a professor, and its distortion and use to deny him employment was a vicious attack on his democratic right to free speech, as well as a direct assault on academic freedom. As Salaita correctly responded, the university’s action was based on “a highly subjective and sprawling standard that can be used to attack faculty who espouse unpopular or unconventional views.”

The university proceeded to act on just such a perverse “standard” and then defended it as a blow for “civility” and even “democracy.”

Following a September 11 vote by the Illinois Board of Trustees to uphold the decision to victimize Salaita, the board’s chairman, Christopher Kennedy, insisted that anyone “with an open mind” would be convinced “we did the right thing, ethically and procedurally.”

Kennedy, the son of the assassinated senator and Democratic presidential candidate Robert Kennedy, is a political appointee of Illinois’ Democratic Governor Pat Quinn. He has acted as a fundraiser for Barack Obama and other Democratic candidates.

Referring to the opinions Salaita expressed in his tweets, Kennedy declared: “There can be no place for that in our democracy, and therefore, there will be no place for it in our university.”

If constitutionally protected speech criticizing the policies and actions of Washington’s key Middle East ally, Israel, can have no place “in our democracy” or “in our university,” what other views can be outlawed and suppressed? Why not opposition to imperialist war, or criticism and questioning of the “war on terrorism” pretexts being used to drag the American people into another predatory military intervention in the Middle East based on lies?

The proscription of views as having no place “in our democracy” has a long and ignoble history in the United States, reaching its apogee during the McCarthyite anti-communist witch hunts of the 1950s, which half a century later still cast a dark shadow over American political and intellectual life.

The revival of these anti-democratic methods today in cases like that of Steven Salaita is deeply rooted in the degeneration of American capitalism, expressed most sharply in the poisonous combination of unrestrained militarist violence abroad and unprecedented social inequality and monopolization of wealth at home.

This is why the new McCarthyism enjoys the support not only of the political right and Zionism, but also of ostensibly liberal Democrats like Kennedy and other supporters of a president who has arrogated to himself the “right” to order the assassination of US citizens, while overseeing a massive illegal spying operation that sweeps up virtually all electronic communications of citizens of the US and countries all over the world.

The defense of fundamental democratic rights today is inseparable from the development of a struggle against war and the independent mobilization of the working class in defense of its social and political rights. As part of this struggle, the demand must be raised for an end to the victimization of Steven Salaita and a halt to all attacks on academic and intellectual freedom, which are bound up with the preparations for police state rule.

Bill Van Auken

For-Profit Colleges Are America’s Dream Crushers


Students who attend get an education with a debt sentence.


Imagine corporations that intentionally target low-income single mothers as ideal customers. Imagine that these same companies claim to sell tickets to the American dream — gainful employment, the chance for a middle class life. Imagine that the fine print on these tickets, once purchased, reveals them to be little more than debt contracts, profitable to the corporation’s investors, but disastrous for its customers. And imagine that these corporations receive tens of billions of dollars in taxpayer subsidies to do this dirty work. Now, know that these corporations actually exist and are universities.

Over the last three decades, the price of a year of college has increased by more than 1,200%. American higher education has long been associated with upward mobility, but with student loan debt quadrupling between 2003 and 2013, it’s time to ask whether education alone can really move people up the class ladder. This is a question of obvious relevance for low-income students and students of color.

As Cornell professor Noliwe Rooks and journalist Kai Wright have reported, black college enrollment has increased at nearly twice the rate of white enrollment in recent years, but a disproportionate number of those African-American students end up at for-profit schools. In 2011, two of those institutions, the University of Phoenix (with physical campuses in 39 states and massive online programs) and the online-only Ashford University, produced more black graduates than any other institutions of higher education in the country. Unfortunately, a recent survey by economist Rajeev Darolia shows that for-profit graduates fare little better on the job market than job seekers with high school degrees; their diplomas, that is, are a net loss, offering essentially the same grim job prospects as if they had never gone to college, plus a lifetime debt sentence.

Many students who enroll in such colleges don’t realize that there is a difference between for-profit, public, and private non-profit institutions of higher learning. All three are concerned with generating revenue, but only the for-profit model exists primarily to enrich its owners. The largest of these institutions are often publicly traded, nationally franchised corporations legally beholden to maximize profit for their shareholders before maximizing education for their students. While commercial vocational programs have existed since the nineteenth century, for-profit colleges in their current form are a relatively new phenomenon that began to boom with a series of initial public offerings in the 1990s, followed quickly by deregulation of the sector as the millennium approached. Bush administration legislation then weakened government oversight of such schools, while expanding their access to federal financial aid, making the industry irresistible to Wall Street investors.

While the for-profit business model has generally served investors well, it has failed students. Retention rates are abysmal and tuitions sky-high. For-profit colleges can be up to twice as expensive as Ivy League universities, and routinely cost five or six times the price of a community college education. The Medical Assistant program at for-profit Heald College in Fresno, California, costs $22,275. A comparable program at Fresno City College costs $1,650. An associate degree in paralegal studies at Everest College in Ontario, California, costs $41,149, compared to $2,392 for the same degree at Santa Ana College, a mere 30-minute drive away.

Exorbitant tuition means students, who tend to come from poor backgrounds, have to borrow from both the government and private sources, including Sallie Mae (the country’s largest originator, servicer, and collector of student loans) and banks like Chase and Wells Fargo. A whopping 96% of students who manage to graduate from for-profits leave owing money, and they typically carry twice the debt load of students from more traditional schools.

Public funds in the form of federal student loans have been called the “lifeblood” of the for-profit system, providing on average 86% of revenues. Such schools now enroll around 10% of America’s college students, but take in more than a quarter of all federal financial aid — as much as $33 billion in a single year. By some estimates it would cost less than half that amount to directly fund free higher education at all currently existing two- and four-year public colleges. In other words, for-profit schools represent not a “market solution” to increasing demand for the college experience, but the equivalent of a taxpayer-subsidized subprime education.

Pushing the Hot Button, Poking the Pain

The mantra is everywhere: a college education is the only way to climb out of poverty and create a better life. For-profit schools allow Wall Street investors and corporate executives to cash in on this faith.

Publicly traded schools have been shown to have profit margins, on average, of nearly 20%. A significant portion of these taxpayer-sourced proceeds is spent on Washington lobbyists to keep regulations weak and federal money pouring in. Meanwhile, these debt factories pay their chief executive officers $7.3 million in average yearly compensation. John Sperling, architect of the for-profit model and founder of the University of Phoenix, which serves more students than the entire University of California system or all the Ivy League schools combined, died a billionaire in August.

Graduates of for-profit schools generally do not fare well. Indeed, they rarely find themselves in the kind of work they were promised when they enrolled, the kind of work that might enable them to repay their debts, let alone purchase the commodity-cornerstones of the American dream like a car or a home.

In the documentary “College Inc.,” produced by PBS’s investigative series Frontline, three young women recount how they enrolled in a nursing program at Everest College on the promise of $25-$35 an hour jobs on graduation. Course work, however, turned out to consist of visits to the Museum of Scientology to study “psychiatrics” and visits to a daycare center for their “pediatrics rotation.” They each paid nearly $30,000 for a 12-month program, only to find themselves unemployable because they had been taught nearly nothing about their chosen field.

In 2010, an undercover investigation by the Government Accountability Office tested 15 for-profit colleges and found that every one of them “made deceptive or otherwise questionable statements” to undercover applicants. These recruiting practices are now under increasing scrutiny from 20 state attorneys general, Senate investigators, and the Consumer Financial Protection Bureau (CFPB), amid allegations that many of these schools manipulate the job placement statistics of their graduates in the most cynical of ways.

The Iraq and Afghanistan Veterans of America, an organization that offers support in health, education, employment, and community-building to new veterans, put it this way in August 2013: “Using high-pressure sales tactics and false promises, these institutions lure veterans into enrolling into expensive programs, drain their post-9/11 GI Bill education benefits, and sign up for tens of thousands of dollars in loans. The for-profits take in the money but leave the students with a substandard education, heavy student loan debt, non-transferable credits, worthless degrees, or no degrees at all.”

Even President Obama has spoken out against instances where for-profit colleges preyed upon troops with brain damage: “These Marines had injuries so severe some of them couldn’t recall what courses the recruiter had signed them up for.”

As it happens, recruiters for such schools are manipulating more than statistics. They are mining the intersections of class, race, gender, inequality, insecurity, and shame to hook students. “Create a sense of urgency. Push their hot button. Don’t let the student off the phone. Dial, dial, dial,” a director of admissions at Argosy University, which operates in 23 states and online, told his enrollment counselors in an internal email.

A training manual for recruiters at ITT Tech, another multi-state and virtual behemoth, instructed its employees to “poke the pain a bit and remind them who else is depending on them and their commitment to a better future.”  It even included a “pain funnel” — that is, a visual guide to help recruiters exploit prospective students’ vulnerabilities. Pain was similarly a theme at Ashford University, where enrollment advisors were told by their superiors to “dig deep” into students’ suffering to “convince them that a college degree is going to solve all their problems.”

An internal document from Corinthian Colleges, Inc. (owner of Everest, Heald, and Wyotech colleges) specified that its target demographic is “isolated,” “impatient” individuals with “low self-esteem.”  They should have “few people in their lives who care about them and be stuck in their lives, unable to imagine a future or plan well.”

These recruiting strategies are as well funded as they are abhorrent. When an institution of higher learning is driven primarily by the needs of its shareholders, not its students, the drive to get “asses in classes” guarantees that marketing budgets will dwarf whatever is spent on faculty and instruction. According to David Halperin, author of Stealing America’s Future: How For-Profit Colleges Scam Taxpayers and Ruin Students’ Lives, “The University of Phoenix has spent as much as $600 million a year on advertising; it has regularly been Google’s largest advertiser, spending $200,000 a day.”

At some schools, the money put into the actual education of a single student has been as low as $700 per year. The Senate’s Health, Education, Labor, and Pensions Committee revealed that 30 of the for-profit industry’s biggest players spent $4.2 billion — or 22.7% of their revenue — on recruiting and marketing in 2010.

Subprime Schools, Swindled Students

In profit paradise, there are nonetheless signs of trouble. Corinthian Colleges, Inc., for instance, is under investigation by several state and federal agencies for falsifying job-placement rates and lying to students in marketing materials. In June, the Department of Education discovered that the company was on the verge of collapse and began supervising a search for buyers for its more than 100 campuses and online operations. In this “unwinding process,” some Corinthian campuses have already shut down. To make matters worse, this month the Consumer Financial Protection Bureau announced a $500 million lawsuit accusing Corinthian of running a “predatory lending scheme.”

As the failure of Corinthian unfolds, those who understood it to be a school — namely, its students — have been left in the lurch. Are their hard-earned degrees and credits worthless?  Should those who are enrolled stay put and hope for the storm to pass or jump ship to another institution? Social media reverberate with anxious questions.

Nathan Hornes started the Facebook group “Everest Avengers,” a forum where students who feel confused and betrayed can share information and organize. A 2014 graduate of Everest College’s Ontario, California, branch, Nathan graduated with a 3.9 GPA, a degree in Business Management, and $65,000 in debt. Unable to find the gainful employment Everest promised him, he currently works two fast-food restaurant jobs. Nathan’s dreams of starting a record label and a music camp for inner city kids will be deferred even further into some distant future when his debts come due: a six-month grace period expires in October and Nathan will owe $380 each month on federal loans alone. “Do I want to pay bills or my loans?” he asks. Corinthian has already threatened to sue him if he fails to make payments.

Asked to explain Corinthian’s financial troubles, Trace Urdan, a market analyst for Wells Fargo Bank, Corinthian’s biggest equity investor, argued that the school attracts “subprime students” who “can be expected — as a group — to repay at levels far lower than most student loans.” And yet, as Corinthian’s financial woes mounted, the corporation stopped paying rent at its Los Angeles campuses and couldn’t pay its own substantial debts to lenders, including Bank of America, from whom it sought a debt waiver.

That Corinthian can request debt waivers from its lenders should give us pause. Who, one might ask, is the proper beneficiary of a debt waiver in this case? No such favors will be done for Nathan Hornes or other former Corinthian students, though they have effectively been led into a debt trap with an expert package of misrepresentations, emotional manipulation, and possibly fraud.

From Bad Apples to a Better System, or Everest Avenged

As is always the case with corporate scandals, Corinthian is now being described as a “bad apple” among for-profits, not evidence of a rotten orchard. The fact is that for-profits like Corinthian exemplify all the contradictions of the free-market model that reformers present as the only solution to the current crisis in higher education: not only are these schools 90% dependent on taxpayer money, but tenure doesn’t exist, there are no faculty unions, most courses are offered online with low overhead costs, and students are treated as “customers.”

It’s also worth remembering that at “public” universities, it is now nearly impossible for working class or even middle class students to graduate without debt. This sad state of affairs — so the common version of the story goes — is the consequence of economic hard times, which require belt tightening and budget cuts. And so it has come to pass that strapped community colleges are now turning away would-be enrollees who wind up in the embrace of for-profits that proceed to squeeze every penny they can from them and the public purse as well. (All the while, of course, this same tale provides for-profits with a cover: they are offering a public service to a marginalized and needy population no one else will touch.)

The standard narrative that, in the face of shrinking tax revenues, public universities must relentlessly raise tuition rates turns out, however, to be full of holes. As political theorist Robert Meister points out, this version of the story ignores the complicity of university leaders in the process. Many of them were never passive victims of privatization; instead, they saw tuition, not taxpayer funding, as the superior and preferred form of revenue growth.

Beginning in the 1990s, universities, public and private, began working ever more closely with Wall Street, which meant using tuition payments not just as direct revenue but also as collateral for debt-financing. Consider the venerable but beleaguered University of California system: a 2012 report out of its Berkeley branch, “Swapping Our Futures,” shows that the whole system was losing $750,000 each month on interest-rate swaps — a financial product that promised lower borrowing costs, but ended up draining the U.C. system of already-scarce resources.

In the last decade, its swap agreements have cost it over $55 million and could, in the end, add up to a loss of $200 million. Financiers, as the university’s creditors, are promised ever-increasing tuition as the collateral on loans, forcing public schools to aggressively recruit ever more out-of-state students, who pay higher tuitions, and to raise the in-state tuition relentlessly as well, simply to meet debt burdens and keep credit ratings high.

Instead of being the social and economic leveler many believe it to be, American higher education in the twenty-first century too often compounds the problem of inequality through debt-servitude. Referring to student debt, which has by now reached $1.2 trillion, Meister suggests, “Add up the lifetime debt service that former students will pay on $1 trillion, over and above the principal they borrow, and you could run a very good public university system for what we are paying capital markets to fund an ever-worsening one.”

You Are Not a Loan

The big problem of how we finance education won’t be solved overnight. But one group is attempting to provide both immediate aid to students like Nathan Hornes and a vision for rethinking debt as a systemic issue. On September 17th, the Rolling Jubilee, an offshoot of Occupy Wall Street, announced the abolition of a portfolio of debt worth nearly $4 million originating from for-profit Everest College. This granted nearly 3,000 former students no-strings-attached debt relief.

The authors of this article have both been part of this effort. To date, the Rolling Jubilee has abolished nearly $20 million of medical and educational debt by taking advantage of a little-known trade secret: debt is often sold to debt collectors for mere pennies on the dollar. A medical bill that was originally $1,000 might sell to a debt collector for 4% of its sticker price, or $40. This allowed the Rolling Jubilee project to make a multi-million-dollar impact with a budget of approximately $700,000 raised in large part through small individual donations.
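The leverage in that secondary-market arithmetic can be sketched in a few lines. The 4% price and the dollar figures come from the article itself; the little function below is purely illustrative, not anything the Rolling Jubilee actually runs:

```python
# Illustrative sketch of the pennies-on-the-dollar arithmetic described above.
# The 4% price and the budget figure come from the article; the function is hypothetical.

def face_value_abolished(budget: float, price_per_dollar: float) -> float:
    """Face value of debt a given budget can buy (and retire) at a given price."""
    return budget / price_per_dollar

# A $1,000 medical bill sold at 4 cents on the dollar costs a collector $40.
cost_to_buy = 1_000 * 0.04  # 40.0

# At that same rate, a ~$700,000 fund could retire $17.5 million in face-value
# debt, in the neighborhood of the ~$20 million the article reports.
print(face_value_abolished(700_000, 0.04))
```

The point the numbers make is that the multiplier is just the reciprocal of the price: at 4 cents on the dollar, every donated dollar erases twenty-five dollars of face-value debt.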

The point of the Rolling Jubilee is simple enough: we believe people shouldn’t have to go into debt for basic needs. For the last four decades, easy access to credit has masked stagnating wages and crumbling social services, forcing many Americans to debt-finance necessities like college, health care, and housing, while the creditor class has reaped enormous rewards. But while we mean the Jubilee’s acts to be significant, we know it is not a sustainable solution to the problem at hand. There is no way to buy and abolish all the odious debt sloshing around our economy, nor would we want to. Given the way our economy is structured, people would start slipping into the red again the minute their debts were wiped out.

The Rolling Jubilee instead raises a question: If a ragtag group of activists can find a way to provide immediate relief to even a few thousand defrauded students, why can’t the government?

The Consumer Financial Protection Bureau’s lawsuit against Corinthian Colleges, Inc. is a good first step, but it only applies to specific private loans originating after 2011, and it will likely take years to play out. Until it’s resolved, students are still technically on the hook and many will be harassed by unscrupulous debt collectors attempting to extract money from them while they still can. In the meantime, the Department of Education (DOE) — which has far greater purview than the CFPB — is effectively acting as a debt collector for a predatory lender, instead of using its discretionary power to help students. Why didn’t the DOE simply let Corinthian go bankrupt, as often happens to private institutions, and so let the students’ debts become dischargeable?

Such debt discharge is well within the DOE’s statutory powers. When a school under its jurisdiction has broken state laws or committed fraud it is, in fact, mandated to offer debt discharge to students. Yet in Corinthian’s opaque, unaccountable unwinding process, the Department of Education appears to be focused on keeping as many of these predatory “schools” open as possible.

No less troubling, the DOE actually stands to profit off Corinthian’s debt payments, as it does from all federally secured educational loans, regardless of the school they are associated with. Senator Elizabeth Warren has already sounded the alarm about the department’s conflict of interest when it comes to student debt, citing an estimate that the government stands to rake in up to $51 billion in a single year on student loans. As Warren points out, it’s “obscene” for the government to treat education as a profit center.

Can there be any doubt that funds reaped from the repayment of federally backed loans by Corinthian students are especially ill-gotten gains? Nathan Hornes and his fellow students should be the beneficiaries of debt relief, not further dispossession.

Unless people agitate, no reprieve will be offered. Instead there may be slaps on the wrist for a few for-profit “bad apples,” with policymakers presenting possible small reductions in interest rates or income-based payments for student borrowers as major breakthroughs.

We need to think bigger. There is an old banking adage: if you owe the bank $1,000, the bank owns you; if you owe the bank $1 million, you own the bank. Individually, student debt is an incapacitating burden. But as Nathan and others are discovering, as a premise for collective action, it can offer a new kind of leverage. Debt collectives, effectively debtors’ unions, may be the next stage of anti-austerity organizing. Collective action offers many possibilities for building power against creditors through collective bargaining, including the power to threaten a debt strike. Where for-profits prey on people’s vulnerability, isolation, and shame, debt collectives would nurture feelings of strength, solidarity, and outrage.

Those who profit from education fear such a transformation, and understandably so. “We ask students to make payments while in school to help them develop the discipline and practice of repaying their federal and other loan obligations,” a Corinthian Colleges spokesman said in response to the news of CFPB’s lawsuit.

It’s absurd: a single mother working two jobs and attending online classes to better her life is discipline personified, even if she can’t always pay her loans on time. The executives and investors living large off her financial aid are the ones who need to be taught a lesson. Perhaps we should collectively demand that as part of their punishment these predators take a course in self-discipline taught by their former students.

Hannah Appel is a mother, activist, and assistant professor of anthropology at UCLA. Her work looks at the everyday life of capitalism and the economic imagination. She has been active with Occupy Wall Street since 2011.

 
Astra Taylor is a writer and documentary filmmaker. Her latest film is Examined Life: Excursions With Contemporary Thinkers, now available with a companion book, published by The New Press.

Eugene O’Neill on Happiness, Hard Work, and Success in a Letter to His Unmotivated Young Son

“Any fool knows that to work hard at something you want to accomplish is the only way to be happy.”

By the time he was fifty, playwright Eugene O'Neill had just about every imaginable cultural accolade under his belt, including three Pulitzers and a Nobel Prize. But the very tools that ensured his professional success — dogged dedication to his work, an ability to block out any distraction, razor-sharp focus on his creative priorities — left his personal life on the losing side of a tradeoff. Thrice married, he fathered three children with his first two wives. His youngest son, Shane, was a sweet yet troubled boy who worshipped his father but failed to live up to his own potential.

In the summer of 1939, as O'Neill completed his acclaimed play The Iceman Cometh, Shane was expelled from yet another school. Frustrated with the boy's record of such dismissals over the course of his academic career, O'Neill sent his 19-year-old son a magnificent letter epitomizing tough love, found in Posterity: Letters of Great Americans to Their Children (public library) — the wonderful anthology that gave us Albert Einstein's advice to his son on the secret to learning anything, Sherwood Anderson on the key to the creative life, Benjamin Rush on travel and life, Lincoln Steffens on the power of not-knowing, and some of history's greatest motherly advice. While heavy on the love, O'Neill's letter is also unflinchingly honest in its hard truths about life, success, and the key to personal fulfillment.

O’Neill doesn’t take long to cut to the idea that an education is something one claims, not something one gets. With stern sensitivity, he issues an admonition that would exasperate the archetypal millennial (that archetype being, of course, merely another limiting stereotype) and writes:

All I know is that if you want to get anywhere with it, or with anything else, you have got to adopt an entirely different attitude from the one you have had toward getting an education. In plain words, you’ve got to make up your mind to study whatever you undertake, and concentrate your mind on it, and really work at it. This isn’t wisdom. Any damned fool in the world knows it’s true, whether it’s a question of raising horses or writing plays. You simply have to face the prospect of starting at the bottom and spending years learning how to do it.

O’Neill’s son seems to suffer from Fairy Godmother Syndrome — the same pathology afflicting many young people today, from aspiring musicians clamoring to be on nationally televised talent competitions that would miraculously “make” their career to online creators nursing hopes of being “discovered” with a generous nod from an established internet goddess or god. O’Neill captures this in a beautiful lament:

The trouble with you, I think, is you are still too dependent on others. You expect too much from outside you and demand too little of yourself. You hope everything will be made smooth and easy for you by someone else. Well, it’s coming to the point where you are old enough, and have been around enough, to see that this will get you exactly nowhere. You will be what you make yourself and you have got to do that job absolutely alone and on your own, whether you’re in school or holding down a job.

O’Neill points to finding one’s purpose, and the inevitable work ethic it requires, as the surest way to attain fulfillment in life:

The best I can do is to try to encourage you to work hard at something you really want to do and have the ability to do. Because any fool knows that to work hard at something you want to accomplish is the only way to be happy. But beyond that it is entirely up to you. You’ve got to do for yourself all the seeking and finding concerned with what you want to do. Anyone but yourself is useless to you there.

What I am trying to get firmly planted in your mind is this: In the really important decisions of life, others cannot help you. No matter how much they would like to. You must rely on yourself. That is the fate of each one of us. It can’t be changed. It just is like that. And you are old enough to understand this now.

And that’s all of that. It isn’t much help in a practical advice way, but in another way it might be. At least, I hope so.

Toward the end of the letter, O’Neill makes a sidewise remark that might well be his most piercing and universally valuable piece of wisdom:

I’m glad to know of your doing so much reading and that you’re becoming interested in Shakespeare. If you really like and understand his work, you will have something no one can ever take from you.

http://www.brainpickings.org/2014/09/15/eugene-oneill-hard-work-letter-to-son/