You should actually blame America for everything you hate about internet culture

November 21

The tastes of American Internet users are both well-known and much-derided: Cat videos. Personality quizzes. Lists of things that only people from your generation/alma mater/exact geographic area “understand.”

But in France, it turns out, even viral-content fiends are a bit more … sophistiqués.

“In France, articles about cats do not work,” Buzzfeed’s Scott Lamb told Le Figaro, a leading Parisian paper. Instead, he explained, Buzzfeed’s first year in the country has shown it that “the French love sharing news and politics on social networks – in short, pretty serious stuff.”

This is interesting for two reasons: first, as conclusive proof that the French are irredeemable snobs; second, as a crack in the glossy, understudied facade of what we commonly call “Internet culture.”

When the New York Times’s David Pogue tried to define the term in 2009, he ended up with a series of memes: the “Star Wars” kid, the dancing baby, rickrolling, the exploding whale. Likewise, if you look to anyone who claims to cover the Internet culture space — not only Buzzfeed, but Mashable, Gawker and, yeah, yours truly — their coverage frequently plays on what Lamb calls the “cute and positive” theme: boys who work at Target and have swoopy hair, videos of babies acting like “tiny drunk adults,” hamsters eating burritos and birthday cakes.

That is the meaning we’ve assigned to “Internet culture,” itself an ambiguous term: It’s the fluff and the froth of the global Web.

But Lamb’s observations on Buzzfeed’s international growth would actually seem to suggest something different. Cat memes and other frivolities aren’t the work of an Internet culture. They’re the work of an American one.

American audiences love animals and “light content,” Lamb said, but readers in other countries have reacted differently. Germans were skeptical of the site’s feel-good frivolity, he said, and some Australians were outright “hostile.” Meanwhile, in France — land of la mode and le Michelin — critics immediately complained, right at Buzzfeed’s French launch, that the articles were too fluffy and poorly translated. Instead, Buzzfeed quickly found that readers were more likely to share articles about news, politics and regional identity, particularly in relation to the loved/hated Paris, than they were to share the site’s other fare.

A glance at Buzzfeed’s French page would appear to bear that out. Right now, its top stories “Ça fait le buzz” — that’s making the buzz, for you Américains — are “21 photos that will make you laugh every time” and “26 images that will make you rethink your whole life.” They’re not making much buzz, though. Neither has earned more than 40,000 clicks — a pittance for the reigning king of virality, particularly in comparison to Buzzfeed’s versions on the English site.

All this goes to show that the things we term “Internet culture” are not necessarily born of the Internet, itself — the Internet is everywhere, but the insatiable thirst for cat videos is not. If you want to complain about dumb memes or clickbait or other apparent instances of socially sanctioned vapidity, blame America: We started it, not the Internet.

Appelons un chat un chat.

Caitlin Dewey runs The Intersect blog, writing about digital and Internet culture. Before joining the Post, she was an associate online editor at Kiplinger’s Personal Finance.
http://www.washingtonpost.com/news/the-intersect/wp/2014/11/21/you-should-actually-blame-america-for-everything-you-hate-about-internet-culture/

The pharmaceutical industry has flooded America with antipsychotics.


The Most Popular Drug in America Is an Antipsychotic and No One Really Knows How It Works

Does anyone remember Thorazine? It was an antipsychotic given to mentally ill people, often in institutions, that was so sedating, it gave rise to the term “Thorazine shuffle.” Ads for Thorazine in medical journals, before drugs were advertised directly to patients, showed Aunt Hattie in a hospital gown, zoned out but causing no trouble to herself or anyone else. No wonder Thorazine and related drugs Haldol, Mellaril and Stelazine were called chemical straitjackets.
But Thorazine and similar drugs became close to obsolete in 1993, when a second generation of antipsychotics, including Risperdal, Zyprexa, Seroquel, Geodon and Abilify, came online. Called “atypical” antipsychotics, the drugs seemed to have fewer of their predecessors’ side effects, such as dry mouth, constipation and the stigmatizing and permanent facial tics known as TD, or tardive dyskinesia. (In actuality, the side effects were similar.) More importantly, the drugs were obscenely expensive: 100 tablets of Seroquel cost as much as $2,000; Zyprexa, $1,680; and Abilify, $1,644.
One drug that is a close cousin of Thorazine, Abilify, is currently the top-selling prescription drug in the U.S., marketed as a supplement to antidepressants, reports the Daily Beast. Not only is it remarkable that an antipsychotic is outselling all other drugs; no one even knows how it works to relieve depression, writes Jay Michaelson. The standardized United States Product Insert says Abilify’s method of action is “unknown,” though it likely “balances” the brain’s neurotransmitters. Critics counter that antipsychotics don’t treat anything at all; they simply zone people out and produce oblivion. They also point to a concerning rise in the prescription of antipsychotics for routine complaints like insomnia.
They are right. With new names and prices and despite their unknown methods of action, Pharma marketers have devised ways to market drugs like Abilify to the whole population, not just people with severe mental illness. Only one percent of the population, after all, has schizophrenia and only 2.5 percent has bipolar disorder. Thanks to these marketing ploys, Risperdal was the seventh best-selling drug in the world until it went off patent and Abilify currently rules.
Here are some of the ways Big Pharma made antipsychotics everyday drugs.
Approval Creep
Everyone has heard of “mission creep.” In the pharmaceutical world, approval creep means getting the FDA to approve a drug for one thing and pushing a lot of other drug approvals through on the coattails of the first one. Though the atypical antipsychotics were originally drugs for schizophrenia, soon there was a dazzling array of new uses.
Seroquel was first approved in 1997 for schizophrenia but subsequently approved for bipolar disorder, psychiatric conditions in children and finally, like Abilify, as an add-on drug for depression. The depression “market” is so huge that Seroquel’s last approval allowed the former schizophrenia drug to make $5.3 billion a year before it went off patent. But before the add-on approval, AstraZeneca, which makes Seroquel, ran a sleazy campaign to convince depressed people they were really “bipolar.” Ads showed an enraged woman screaming into the phone, her face contorted, her teeth clenched. Is this you? the ads asked. Your depression may really be bipolar disorder, they warned.
Sometimes the indication creep is under the radar. After heated FDA hearings in 2009 about extending Zyprexa, Seroquel and Geodon uses to kids (Pfizer and AstraZeneca slides showed that kids died in clinical trials), the uses were added by the FDA but never announced. They were slipped into the record right before Christmas, when no news breaks, and recorded as “label changes.” Sneaky.
And there is another “creep” which is also under the radar: “warning creep.” As atypical antipsychotics have gone into wide use in the population, more risks have surfaced. Labels now warn against death-associated risks in the elderly, children and people with depression, but you have to really read the fine print. (Atypical antipsychotics are so dangerous in the elderly with dementia that at least 15,000 nursing home residents die from them each year, charged FDA drug reviewer David Graham in congressional testimony.) The Seroquel label now warns against cardiovascular risks, which the FDA denied until the drug was almost off patent.
Dosing Children
Perhaps no drugs but ADHD medications have been so widely used and often abused in children as atypical antipsychotics. Atypical antipsychotics are known to “improve” behavior in problem children across a broad range of diagnoses, but at a huge price: a National Institute of Mental Health study of 119 children ages 8 to 19 found Risperdal and Zyprexa caused such obesity that a safety panel ordered the children off the drugs.
In only eight weeks, kids on Risperdal gained nine pounds and kids on Zyprexa gained 13 pounds. “Kids at school were making fun of me,” said one study participant who put on 35 pounds while taking Risperdal.
Just like the elderly in state care, poor children on Medicaid are tempting targets for Big Pharma and sleazy operators because they do not make their own medication decisions. In 2008, the state of Texas charged Johnson & Johnson subsidiary Janssen with defrauding the state of millions with “a sophisticated and fraudulent marketing scheme” to “secure a spot for the drug, Risperdal, on the state’s Medicaid preferred drug list and on controversial medical protocols that determine which drugs are given to adults and children in state custody.”
Many other states have brought legal action against Big Pharma including compelling drug makers to pay for the extreme side effects that develop with the drugs: massive weight gain, blood sugar changes leading to diabetes and cholesterol problems.
Add-On Conditions
It’s called polypharmacy, and it is increasingly popular: prescribing several drugs, often as a cocktail, that are supposed to do more together than any of them does alone. Big Pharma likes polypharmacy for two obvious reasons: drug sales are tripled or quadrupled, and it’s not possible to know if the drugs are working. The problems with polypharmacy parallel its “benefits.” The person can’t know which, if any, of the drugs are working, so they take them all. By the time someone is on four or more psychiatric drugs, there is a good chance they are on a government program and we are paying. There is also a good chance the person is on the drugs for life, because withdrawal reactions convince them there really is something wrong with them, and it is hard to quit the drugs.
Into this lucrative merchandising model came the idea of “add-on” medications and “treatment-resistant depression.” When someone’s antidepressant didn’t work, Pharma marketers began floating the idea that it wasn’t that the drugs didn’t work, and it wasn’t that the person wasn’t depressed to begin with but simply had real-life job and family problems — it was “treatment-resistant depression.” The person needed to add a second or third drug to their antidepressant, such as Seroquel or Abilify. Ka-ching.
Lawsuits Don’t Stop Unethical Marketing
Just as Big Pharma has camped out in Medicare and Medicaid, living on our tax dollars while fleeing to England so it doesn’t have to pay taxes, Pharma has also camped out in the Department of Defense and Veterans Affairs. Arguably, no drugs have been as good for Big Pharma as atypical antipsychotics within the military. In 2009, the Pentagon spent $8.6 million on Seroquel and the VA spent $125.4 million — almost $30 million more than the cost of an F/A-18 Hornet.
Risperdal was even bigger in the military. Over a period of nine years, VA spent $717 million on its generic, risperidone, to treat PTSD in troops in Afghanistan and Iraq. Yet not only was risperidone not approved for PTSD, it didn’t even work. A 2011 study in the Journal of the American Medical Association found the drug worked no better than placebo and the money was totally wasted.
In the last few years, the makers of Risperdal, Seroquel and Zyprexa have all settled suits claiming illegal or fraudulent marketing. A year ago, Johnson & Johnson admitted to improperly marketing Risperdal in a $2.2 billion settlement. But the penalty is nothing compared with the $24.2 billion it made from selling Risperdal between 2003 and 2010, and shareholders didn’t blink. The truth is, there is too much money in hawking atypical antipsychotics to the general population for Pharma to quit.

 

The Interregnum: Why the Future is so chaotic

“The old is dying, and the new cannot be born; in this interregnum there arises a diversity of morbid symptoms.” – Antonio Gramsci

The morbid symptoms began to appear in the spring of 2003. The Department of Homeland Security was officially formed, and despite the street protests of millions around the world, the United States invaded Iraq on the pretext of capturing Saddam’s “weapons of mass destruction.” By summer it was obvious that there were no such weapons and that we had been tricked into a war from which there was no easy exit. Pollsters began to notice that a majority of Americans felt we were “on the wrong track,” and distrust of our leadership has gotten worse every year since.

So while citizens exhibit historic levels of anger at the country’s drift, neither the political nor the economic leaders have put forth an alternative vision of our future. We are in an Interregnum: the often painful uprooting of old traditions and the hard-fought emergence of the new. The traditional notion of an interregnum refers to the time after a king died and before a new king had been crowned. But for our purposes, it refers to those hinges in time when the old order is dead but the new direction has not been determined. Quite often the general populace does not understand that the transition is taking place, and so a great deal of tumult arises as the birth pangs of a new social and political order. We are in such a time in America.

For those of us who work in the field of media and communications, the signs of the Interregnum are everywhere. Internet services decimate the traditional businesses of music and journalism. For individual journalists or musicians, the old order is clearly dying, but a new way to make a living cannot seem to be birthed. Those who work in film and television can only hope a similar fate does not await their careers. In the world of politics, a similar dynamic is destroying traditional political parties, and the insurgent, bottom-up, networked campaigns pioneered by Barack Obama have become the standard. And yet we realize that for all its insurgency, the Obama campaign did not really usher in a new era. It is clear that there is an American Establishment that stays in power no matter which party controls the White House, and the recent election only makes this more obvious. This top-down establishment order is clearly dying, but it clings to its privileges, and the networked, bottom-up society is not yet empowered.

Since 1953, when two senior partners of a Wall Street law firm, the brothers John Foster and Allen Dulles, began running American foreign (and often domestic) policy, an establishment view has been the norm through Democratic and Republican presidencies alike. As Stephen Kinzer has written of the Dulles brothers in his book The Brothers, “Their life’s work was turning American money and power into global money and power. They deeply believed, or made themselves believe, that what benefited them and their clients would benefit everyone.” They created a world in which the Wall Street elites at first set our foreign policy and eventually (under Ronald Reagan) came to dominate domestic and tax policy — all to the benefit of themselves and their clients.

In 1969 the median salary for a male worker was $35,567 (in 2012 dollars). Today it is $33,904. So for 44 years, while wages for the top 10% have continued to climb, most Americans have been caught in a “Great Stagnation,” bringing into question the whole purpose of the American capitalist economy. The notion that what benefited the establishment would benefit everyone has been thoroughly discredited.

Seen through this lens, the savage partisanship of the current moment makes an odd kind of sense. What were the establishment priorities that moved inexorably forward in both Republican and Democratic administrations? The first was a robust and aggressive foreign policy. As Kinzer writes of the Dulles brothers, “Exceptionalism — the view that the United States has a right to impose its will because it knows more, sees farther, and lives on a higher moral plane than other nations — was to them not a platitude, but the organizing principle of daily life and global politics.” From Eisenhower to Obama, this principle has been the guiding light of our foreign policy, bringing with it annual defense expenditures that dwarf those of all the world’s major powers combined and drive us deeper into debt. The second principle of the establishment was that what is good for Wall Street is good for America. Despite Democrats’ efforts to paint the GOP as the party of Wall Street, one has only to look at the efforts of Clinton’s Treasury secretaries Rubin and Summers to kill the Glass-Steagall Act and deregulate the big banks to see that the establishment rules no matter who is in power. Was it any surprise, then, that Obama appointed the architects of bank deregulation, Summers and Geithner, to clean up the mess their policies had caused?

So when we observe politicians as diverse as Elizabeth Warren and Rand Paul railing against the twin poles of establishment orthodoxy, can we really be surprised? Is there not a new consensus that the era of America as global policeman is over? Is there not agreement from the Tea Party to Occupy Wall Street that the domination of domestic policy by financial elites is over? But here is our Interregnum dilemma. It is one thing to forecast a kind of liberal-libertarian coalition around the issues of defense spending, corporate welfare and even the privacy rights of citizens in a national security state. It is a much more intractable problem to find consensus on the causes and cures of the Great Stagnation. To understand the nature of the current stagnation, we need to look back to the late sixties, when the economy was very different from today’s. In 1966, net investment as a percentage of GDP peaked at 14%, and it has been on a steady decline ever since, despite the computer revolution, which was only getting started in the early 1970s.

Economic growth only comes from three sources: consumption, investment or foreign earnings from trade (the current account). We have been living so long with a negative current account balance and falling investment that economic growth is almost totally dependent on one leg of the stool, consumer spending. But with the average worker unable to get a raise since 1969, consumption can only come from loosened credit standards. As long as the average family could use their home equity as an ATM, the party could continue, driven by the increasing sophistication of advertising and “branded entertainment” designed to induce mall fever in a strapped consumer. And by the late 1990s, consumer preferences began to drive a winner-take-all digital economy in which one to three firms dominated each sector: Apple and Google; Verizon and AT&T; Comcast and Time Warner Cable; Disney, Fox, Viacom and NBC Universal; Facebook and Twitter. All of this was unloosed by the establishment meme of deregulation, a world in which anti-trust regulators had little influence and laissez-faire ruled. These oligopolies began making so much money that they didn’t have enough places to invest it, so corporate cash as a percentage of assets rose to an all-time high.
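The three sources named above are simply the components of the standard national-income accounting identity. As a sketch, in textbook notation (with government spending written separately, which the essay folds into domestic demand):

```latex
% Expenditure breakdown of GDP (standard accounting identity):
%   Y = output, C = consumption, I = investment,
%   G = government spending, X - M = net exports
%   (the trade balance, the bulk of the current account).
\[
  Y = C + I + G + (X - M)
\]
% With (X - M) persistently negative and I in long-run decline,
% growth in Y must lean almost entirely on consumer spending C.
```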

Here is my fear: that our current version of capitalism is not working. Apple holds on to $158 billion in cash because it can’t find a profitable investment. And because the U.S. labor force participation rate is only 64%, a huge number of people can never afford an iPhone, so domestic demand is flat (though very profitable) and the real growth in the digital economy will be in Asia, Africa and South America. There is not much the Fed can do by lowering interest rates to alter this picture. What is needed is not more easy-money loans; it is more decent jobs.

But unlike our left-right consensus on military spending, there is a fierce debate raging between economists about the causes of and solutions to this stagnation. Though both left and right agree the economy has stagnated, there are huge differences in the prospects for emerging from this condition. On the right, the political economist Tyler Cowen’s new book is called Average Is Over: Powering America Beyond the Age of the Great Stagnation. Here is how Cowen sees the next twenty years.

The rise of intelligent machines will spawn new ideologies along with the new economy it is creating. Think of it as a kind of digital social Darwinism, with clear winners and losers: Those with the talent and skills to work seamlessly with technology and compete in the global marketplace are increasingly rewarded, while those whose jobs can just as easily be done by foreigners, robots or a few thousand lines of code suffer accordingly. This split is already evident in the data: The median male salary in the United States was higher in 1969 than it is today. Middle-class manufacturing jobs have been going away due to a mix of automation and trade, and they are not being replaced. The most lucrative college majors are in the technical fields, such as engineering. The winners are doing much better than ever before, but many others are standing still or even seeing wage declines.

On the left, Paul Krugman is not so sure we can emerge from this stagnation.

But what if the world we’ve been living in for the past five years is the new normal? What if depression-like conditions are on track to persist, not for another year or two, but for decades?…In fact, the case for “secular stagnation” — a persistent state in which a depressed economy is the norm, with episodes of full employment few and far between — was made forcefully recently at the most ultrarespectable of venues, the I.M.F.’s big annual research conference. And the person making that case was none other than Larry Summers. Yes, that Larry Summers.

Cowen forecasts a dystopian world where 10% of the population do very well and “the rest of the country will have stagnant or maybe even falling wages in dollar terms, but they will also have a lot more opportunities for cheap fun and cheap education.” That’s real comforting. He predicts the 90% will put up with this inequality for two reasons. First, the country is aging: “remember that riots and protests are typically the endeavors of young hotheads, not sage (or tired) senior citizens.” And second, because of the proliferation of social networks, “envy is local…Right now, the biggest medium for envy in the United States is probably Facebook, not the big yachts or other trophies of the rich and famous.”

Although Cowen cites statistics about the fall in street crime to back up the notion that the majority of citizens are passively accepting gross inequality, I think he completely misunderstands the nature of anti-social pathologies in the Internet Age of Stagnation. Take the example of the website Silk Road.

Silk Road already stands as a tabloid monument to old-fashioned vice and new-fashioned technology. Until the website was shut down last month, it was the place to score, say, a brick of cocaine with a few anonymous strokes on a computer keyboard. According to the authorities, it greased $1.2 billion in drug deals and other crimes, including murder for hire.

From LulzSec to Pirate Bay to Silk Road, the coming anarchy of a Blade Runner-like society is far more vicious than a few street thugs in our major cities. The rise of hard-to-trace virtual currencies like Bitcoin only makes the possibility of a huge crime wave on the Dark Net more imminent — one that IBM estimates already costs the economy $400 billion annually.

So while both Cowen and Krugman agree that stagnation is causing the labor force participation rate to fall, they disagree as to whether anything can be done to remedy the problem.

In the early 1970s, the participation rate began to climb as more and more women entered the workforce. It peaked when George W. Bush entered office and has been on the decline ever since. As the Times’s David Leonhardt has pointed out, this has very little to do with Baby Boomer retirement. The economist Daniel Alpert has argued in his new book, The Age of Oversupply, that “the central challenge facing the global economy is an oversupply of labor, productive capacity and capital relative to the demand for all three.”

Viewed through this lens, neither the policy prescriptions of Republicans nor those of Democrats are capable of changing the dynamic brought about by the entrance of three billion new workers into the global economy over the last 20 years. Republican fears that U.S. deficits will lead to Weimar-like hyperinflation ring hollow in a country where only 63% of the able-bodied are working. Democratic hectoring for the Fed and the banks to lend more to business to stimulate the economy is equally nonsensical when American corporations are sitting on $2.4 trillion in cash.

But there is a way out of this deflationary trap. First, the Republicans have to acknowledge the obvious: America’s corporations are not going to invest in vast amounts of new capacity when there is a glut in almost every sector worldwide. Second, that overcapacity is not going to get absorbed until more people go back to work and start buying the goods from the factories. This was the same problem our country faced in the Great Depression, and the way we got out of it was by putting people to work rebuilding the infrastructure of this country. Did it ever occur to the politicians in Washington that the reason so many bridges, water and electrical systems are failing is that most of them were built 80 years ago, during the Great Depression? For Republicans to insist that more austerity will bring back the “confidence fairy” is exactly the wrong policy prescription for an age of oversupply. But equally destructive, as Paul Krugman points out, are Democratic voices like Erskine Bowles, shouting from any venue that will pay him that the debt apocalypse is upon us.

But the Democrats are also going to have to give up the long-held belief that all good solutions come from Washington. If the Healthcare.gov website debacle has taught us anything, it is that devolving power from Washington to the states is the answer to the complexity of modern governance. While California’s healthcare website performed admirably, the notion of trying to create a centralized system to service 50 different state systems was a fool’s errand. So what is needed is a federalist solution for investment in the infrastructure of the next economy. This is the way out of the Interregnum. Investors buying tax-free municipal bonds to rebuild ancient water systems and bridges, as well as solar and wind plants, will finance much of it. But just as President Eisenhower understood that a national interstate highway system built in the 1950s would lead to huge productivity gains in the 1960s and 1970s, federal tax dollars will have to play a large part in rebuilding America. As we wind down our trillion-dollar commitments to wars in the Middle East, we must engage in an economic conversion strategy, from permanent war to peaceful innovation, that both liberals and libertarians could embrace.

The way to overcome the partisan gridlock on infrastructure spending would be for Obama to commit to a totally federalist solution to our problems. The federal government would put every dollar saved from getting out of Iraq, Afghanistan and all the other defense commitments into block innovation grants to the states. Let’s say the first grant is for $100 billion. It would be given directly to the states on a per capita basis to be used to foster local economic growth. No strings or federal bureaucracy attached to the grants, except that the states would have to publish a yearly accounting of the money in an easily readable form. Then let the press follow the money and see which states come up with the most imaginative solutions. Some states might use the grants to lower the cost of state university tuition. Others might spend the money on high-speed rail lines or municipal fiber broadband and Wi-Fi. As we have found in the corporate sector, pushing power to the edges of an organization helps foster innovation. As former IBM CEO Sam Palmisano told his colleagues, “we have to lower the center of gravity of this organization.”

If it worked, then slowly more money could be transferred to the states in these bureaucracy free block grants. Gradually the bureaucracies of the Federal government would shrink as more and more responsibility was shifted to local supervision of education, health, welfare and infrastructure.

In the midst of our current Washington quagmire, this vision of a growing American middle class may seem like a distant mirage. But it is clear that the establishment consensus on foreign policy, defense spending, domestic spying and corporate welfare has died in the last 12 months. The old top-down establishment order is clearly dying, but just how we build a new order based on a bottom-up, networked society that works for the 90%, not just the establishment, is the question of our age.

“There is but one way out for you”: Read the uncensored letter J. Edgar Hoover wrote to MLK

Historian discovers unredacted copy of longtime FBI chief’s chilling letter

The mutual contempt between civil rights icon Dr. Martin Luther King, Jr. and longtime Federal Bureau of Investigation chief J. Edgar Hoover was hardly a well-kept secret. It was 50 years ago this month that Hoover denounced King as “the most notorious liar in the country” after King publicly took the bureau to task for its woefully inadequate enforcement of civil rights protections. In the years since, historians have documented the FBI’s smear campaign against King, which primarily consisted of wiretapping the activist and digging up dirt on his sexual rendezvous. Perhaps the most chilling piece of evidence uncovered in investigating the FBI’s crusade against King was a threatening 1964 letter — confirmed by U.S. Senator Frank Church’s investigative committee as Hoover’s handiwork — in which Hoover, posing as a disillusioned black supporter, warned that King’s “countless acts of adultery and immoral conduct” would be exposed. For the first time, that letter is available in uncensored form.

 Previous versions of the letter redacted details about King’s sexual liaisons, but while conducting research for a biography of Hoover this summer, Yale University historian Beverly Gage happened upon an uncensored version “tucked away in a reprocessed set of his official and confidential files at the National Archives,” she writes in the forthcoming New York Times Magazine.

Containing no fewer than six uses of the word “evil,” the letter assails King as a fraud and appears to have been sent along with a wiretapped recording of the civil rights activist engaged in an extramarital encounter.

“Lend your sexually psychotic ear to the enclosure,” Hoover writes in one passage.

“You know you are a complete fraud and a great liability to all of us Negroes,” the letter reads. “You are a colossal fraud and an evil, vicious one at that,” Hoover later adds.

Hoover vows that King will soon be “exposed on the record for all time.”

“Yes, from your various evil playmates on the east coast to [here an individual’s name is redacted because Hoover’s allegations about her have not been confirmed or debunked] and others on the west coast and outside the country you are on the record. King you are done,” Hoover declares.

The letter concludes with a menacing declaration that “there is only one thing left for you to do,” giving King a deadline of 34 days before he would be exposed.

“You are done,” Hoover writes. “There is but one way out for you. You better take it before your filthy, abnormal, fraudulent self is bared to the nation.”

As Gage notes, King told associates that he was convinced that someone — likely Hoover — was trying to provoke him to commit suicide.

“One oddity of Hoover’s campaign against King is that it mostly flopped, and the F.B.I. never succeeded in seriously damaging King’s public image,” Gage writes. “Half a century later, we look upon King as a model of moral courage and human dignity. Hoover, by contrast, has become almost universally reviled. In this context, perhaps the most surprising aspect of their story is not what the F.B.I. attempted, but what it failed to do.”

Read the letter below, via Gawker:

Luke Brinker is Salon’s deputy politics editor. Follow him on Twitter at @LukeBrinker.

Slashed and Hidden from Sight: The Strange Power of Cursed Paintings

Edwin Landseer, "Man Proposes, God Disposes" (1864), oil painting (via Wikimedia)

Can a painting drive a person to madness? While there is no doubt staring at something like Goya’s unnerving Black Paintings for hours might be destabilizing, the powers of derangement in art are mostly superstition. Yet at the University of London’s Royal Holloway, one painting is regularly draped in a Union Jack flag due to an old fear that its gruesome visuals could snap the sanity from a student’s brain.

Edwin Landseer’s 1864 “Man Proposes, God Disposes” has creeped people out since its debut with its two polar bears scavenging at the wreckage of the ill-fated Franklin expedition to the Northwest Passage. One creature has a human rib bone rapturously clenched in its fangs; the other lunges at a scrap of fabric drenched in a blood-red color. William Michael Rossetti mourned it as the “saddest of membra disjecta.” The widowed Lady Franklin was, not surprisingly, dismayed, and some even asked if Landseer, known for his noble dogs, was getting a bit unhinged.

College Curator Laura MacCulloch explains: “No one quite knows when the tradition of covering the picture first began but according to an article published in 1984 it seems to have started in the 1970s when a rumour was spread that a student who looked directly at the painting during an exam, went mad and committed suicide.” That student reportedly scrawled “the polar bears made me do it” on their incomplete exam, although there’s no evidence this is more than urban legend. A replica just went on view in Calgary in the Glenbow Museum’s Vanishing Ice: Alpine and Polar Landscapes in Art, reviving the sinister tale.

Slashes from Abram Balashov's 1913 attack on "Ivan the Terrible and His Son Ivan" (1885) by Ilya Repin (via Wikimedia)

Supposedly cursed artifacts and art are in almost every museum, from a cursed amethyst held by the Natural History Museum in London to a cursed meteorite at the Field Museum in Chicago. Myths of madness often swirl around radical art: during an 1874 Impressionism show, one visitor reportedly flew into a rage and bit people on the street.

There’s also the curious case of a painting stabbed in 1913. Abram Balashov slashed the grisly “Ivan the Terrible and His Son Ivan” (1885) by Ilya Repin three times, screaming “Stop the bloodshed!” before he was hauled away to a mental institution. Likely Balashov was already unstable before gazing into the horrible blood-shot eyes of Ivan, but it was reportedly just the most extreme of a series of violent responses.

Both the Repin and Landseer paintings weren’t just brutal images; they also attacked the status quo of their respective countries. Repin vividly depicted royal bloodshed, while Landseer exposed the failure of a supposedly infallible Victorian England. Reports of the cannibalism resorted to by the Franklin expedition sent the country into denial, and the total disappearance of the two ships haunted the following decades of exploration (one of the boats was finally found just this year). Perhaps it’s this gory evocation of total defeat that got the superstition started: as unsettling a message as any for college students sitting exams.

Hanging out with the disgruntled guys who babysit our aging nuclear missiles—and hate every second of it.

Death Wears Bunny Slippers

Illustration by Tavis Coburn

Along a lonely state highway on central Montana’s high plains, I approach what looks like a ranch entrance, complete with cattle guard. “The first ace in the hole,” reads a hand-etched cedar plank hanging from tall wooden posts. “In continuous operation for over 50 years.” I drive up the dirt road to a building surrounded by video cameras and a 10-foot-tall, barbed-wire-topped fence stenciled with a poker spade. “It is unlawful to enter this area,” notes a sign on the fence, whose small print cites the Subversive Activities Control Act of 1950, a law that once required communist organizations to register with the federal government. “Use of deadly force authorized.”

I’m snapping photos when a young airman appears. “You’re not taking pictures, are you?” he asks nervously.

“Yeah, I am,” I say. “The signs don’t say that I can’t.”

“Well, we might have to confiscate your phone.”

Maybe he should. We’re steps away from the 10th Missile Squadron Alpha Missile Alert Facility, an underground bunker capable of launching several dozen nuclear-tipped Minuteman III intercontinental ballistic missiles (ICBMs), with a combined destructive force 1,000 times that of the Hiroshima bomb.

Another airman comes out of the ranch house and asks for my driver’s license. He’s followed by an older guy clad in sneakers, maroon gym shorts, and an air of authority. “I’m not here to cause trouble,” I say, picturing myself in a brig somewhere.

“Just you being here taking photos is causing trouble,” he snaps.

An alarm starts blaring from inside the building. One airman turns to the other. “Hey, there’s something going off in there.”

Six hours earlier, I was driving through Great Falls with a former captain in the Air Force’s 341st Missile Wing. Aaron, as I’ll call him, had recently completed a four-year stint at the Alpha facility. Had President Obama ordered an attack with ICBMs, Aaron could have received a coded message, authenticated it, and been expected to turn a launch key.

Also read: “That Time We Almost Nuked North Carolina“—a timeline of near-misses, mishaps, and scandals from our atomic arsenal.

We kept passing unmarked blue pickup trucks with large tool chests—missile maintenance guys. The Air Force doesn’t like to draw attention to the 150 silos dotting the surrounding countryside, and neither does Great Falls. With about 4,000 residents and civilian workers and a $219 million annual payroll, Malmstrom Air Force Base drives the local economy, but you won’t see any missile-themed bars or restaurants. “We get some people that have no idea that there’s even an Air Force base here,” one active-duty missileer told me.

It’s not just Great Falls practicing selective amnesia. The days of duck-and-cover drills, fallout shelters, and No Nukes protests are fading memories—nowhere more so than in the defense establishment. At a July 2013 forum in Washington, DC, Lt. General James Kowalski, who commands all of the Air Force’s nuclear weapons, said a Russian nuclear attack on the United States was such “a remote possibility” that it was “hardly worth discussing.”

But then Kowalski sounded a disconcerting note that has a growing number of nuclear experts worried. The real nuclear threat for America today, he said, “is an accident. The greatest risk to my force is doing something stupid.”

Lt. General James Kowalski. Air Force

“You can’t screw up once—and that’s the unique danger of these machines,” points out investigative journalist Eric Schlosser, whose recent book, Command and Control, details the Air Force’s stunning secret history of nuclear near-misses, from the accidental release of a hydrogen bomb that would have devastated North Carolina to a Carter-era computer glitch that falsely indicated a shower of incoming Soviet nukes. “In this business, you need a perfect safety record.”

Once the military’s crown jewels, ICBM bases have become “little orphanages that get scraps for dinner.”

And a perfect record, in a homeland arsenal made up of hundreds of missiles and countless electronic and mechanical systems that have to operate flawlessly—to say nothing of the men and women at the controls—is a very hard thing to achieve. Especially when the rest of the nation seems to have forgotten about the whole thing. “The Air Force has not kept its ICBMs manned or maintained properly,” says Bruce Blair, a former missileer and cofounder of the anti-nuclear group Global Zero. Nuclear bases that were once the military’s crown jewels are now “little orphanages that get scraps for dinner,” he says. And morale is abysmal.

Blair’s organization wants to eliminate nukes, but he argues that while we still have them, it’s imperative that we invest in maintenance, training, and personnel to avoid catastrophe: An accident resulting from human error, he says, may actually be more likely today because the weapons are so unlikely to be used. Without the urgent sense of purpose the Cold War provided, the young men (and a handful of women) who work with the world’s most dangerous weapons are left logging their 24-hour shifts under subpar conditions—with all the dangers that follow.

In August 2013, Air Force commanders investigated two officers in the ICBM program suspected of using ecstasy and amphetamines. A search of the officers’ phones revealed more trouble: They and other missileers were sharing answers for the required monthly exams that test their knowledge of things like security procedures and the proper handling of classified launch codes. Ultimately, 98 missileers were implicated for cheating or failure to report it. Nine officers were stripped of their commands, and Colonel Robert Stanley, the commander of Malmstrom’s missile wing, resigned.

The Air Force claimed the cheating only went as far back as November 2011. Ex-missileers told me it went back decades: “Everybody has cheated on those tests.”

The Air Force claimed the cheating only went as far back as November 2011, but three former missileers told me it was the norm at Malmstrom when they arrived there back in 2007, and that the practice was well established. (Blair told me that cheating was even common when he served at Malmstrom in the mid-1970s.) Missileers would check each other’s tests before turning them in and share codes indicating the correct proportion of multiple-choice answers on a given exam. If the nuclear program’s top brass, who all began their careers as missileers, weren’t aware of it, the men suggested, then they were willfully looking the other way. “You know in Casablanca, when that inspector was ‘absolutely shocked’ that there was gambling at Rick’s? It’s that,” one recently retired missileer told me. “Everybody has cheated on those tests.”

Cheating is just one symptom of what Lt. Colonel Jay Folds, then the commander of the nuclear missile wing at North Dakota’s Minot Air Force Base, called “rot” in the atomic force. Last November, Associated Press reporter Robert Burns obtained a RAND study commissioned by the Air Force. It concluded that the typical launch officer was exhausted, cynical, and distracted on the job. ICBM airmen also had high rates of sexual assault, suicide, and spousal and child abuse, and more than double the rates of courts-martial than Air Force personnel as a whole.

The morale problems were well known to Michael Carey, the two-star general who led the program at the time the cheating was revealed. Indeed, he pointed them out to other Americans during an official military cooperation trip to Moscow, before spending the rest of his three-day visit on a drunken bender, repeatedly insulting his Russian military hosts and partying into the wee hours with “suspect” foreign women, according to the Air Force’s inspector general. He later confessed to chatting for most of a night with the hotel’s cigar sales lady, who was asking questions “about physics and optics”—and thinking to himself: “Dude, this doesn’t normally happen.” Carey was stripped of his command in October 2013.

The embarrassments just keep coming. Last week, the Air Force fired two more nuclear commanders, including Col. Carl Jones, the No. 2 officer in the 90th Missile Wing at Wyoming’s Warren Air Force Base, and disciplined a third, for a variety of leadership failures, including the maltreatment of subordinates. In one instance, two missileers were sent to the hospital after exposure to noxious fumes at a control center—they had remained on duty for fear of retaliation by their commander, Lt. Col. Jimmy “Keith” Brown. This week, the Pentagon is expected to release a comprehensive review of the nuclear program that details “serious problems that must be addressed urgently.”

“Their buddies from the B-52s and B-2s tell them all sorts of exciting stories about doing real things in Afghanistan and Iraq. They end up feeling superfluous.”

Stung by the recent bad press, the Air Force has announced pay raises, changes to the proficiency tests, and nearly $400 million in additional spending to increase staffing and update equipment. In the long term, Congress and the administration are debating a trillion-dollar suite of upgrades to the nuclear program, which could include replacing the existing ICBMs and warheads with higher-tech versions.

But outside experts say none of the changes will address the core of the problem: obsolescence. “There is a morale issue,” says Hans Kristensen, who directs the Federation of American Scientists’ Nuclear Information Project, “that comes down to the fundamental question: How is the ICBM force essential? It’s hard to find that [answer] if you sit in the hole out there. Their buddies from the B-52s and B-2s tell them all sorts of exciting stories about doing real things in Afghanistan and Iraq. They end up feeling superfluous.”

A missile commander’s launch switches. National Park Service

Indeed, on my first night in town, over beer and bison burgers, Aaron had introduced me to “Brent,” another former missileer who looks more like a surfer now that his military crew cut is all grown out. Brent lost faith in his leaders early on, he told me, when he saw the way they tolerated, if not encouraged, a culture of cheating. He’d resisted the impulse, he said, and his imperfect test scores disqualified him for promotions. But the worst part of the gig, the guys agreed, might be the stultifying tedium of being stuck in a tiny room all day and night waiting for an order you knew would never come. “Any TV marathon you can stumble upon is good,” Brent said. “Even if it’s something you hate. It’s just that ability to zone out and lose time.”
CONTINUED:  http://www.motherjones.com/politics/2014/11/air-force-missile-wing-minuteman-iii-nuclear-weapons-burnout

Why “Psychological Androgyny” Is Essential for Creativity

“Creative individuals are more likely to have not only the strengths of their own gender but those of the other one, too.”

Despite the immense canon of research on creativity — including its four stages, the cognitive science of the ideal creative routine, the role of memory, and the relationship between creativity and mental illness — very little has focused on one of life’s few givens that equally few of us can escape: gender and the genderedness of the mind.

In Creativity: The Psychology of Discovery and Invention (public library) — one of the most important, insightful, and influential books on creativity ever written — pioneering psychologist Mihaly Csikszentmihalyi examines a curious, under-appreciated yet crucial aspect of the creative mindset: a predisposition to psychological androgyny.

In all cultures, men are brought up to be “masculine” and to disregard and repress those aspects of their temperament that the culture regards as “feminine,” whereas women are expected to do the opposite. Creative individuals to a certain extent escape this rigid gender role stereotyping. When tests of masculinity/femininity are given to young people, over and over one finds that creative and talented girls are more dominant and tough than other girls, and creative boys are more sensitive and less aggressive than their male peers.

Illustration by Yang Liu from ‘Man Meets Woman,’ a pictogram critique of gender stereotypes

Csikszentmihalyi points out that this psychological tendency toward androgyny shouldn’t be confused with homosexuality — it deals not with sexual constitution but with a set of psychoemotional capacities:

Psychological androgyny is a much wider concept, referring to a person’s ability to be at the same time aggressive and nurturant, sensitive and rigid, dominant and submissive, regardless of gender. A psychologically androgynous person in effect doubles his or her repertoire of responses and can interact with the world in terms of a much richer and varied spectrum of opportunities. It is not surprising that creative individuals are more likely to have not only the strengths of their own gender but those of the other one, too.

Citing his team’s extensive interviews with 91 individuals who scored high on creativity in various fields — including pioneering astronomer Vera Rubin, legendary sociobiologist E.O. Wilson, philosopher and marginalia champion Mortimer Adler, universe-disturber Madeleine L’Engle, social science titan John Gardner, poet extraordinaire Denise Levertov, and MacArthur genius Stephen Jay Gould — Csikszentmihalyi writes:

It was obvious that the women artists and scientists tended to be much more assertive, self-confident, and openly aggressive than women are generally brought up to be in our society. Perhaps the most noticeable evidence for the “femininity” of the men in the sample was their great preoccupation with their family and their sensitivity to subtle aspects of the environment that other men are inclined to dismiss as unimportant. But despite having these traits that are not usual to their gender, they retained the usual gender-specific traits as well.

Illustration from the 1970 satirical book ‘I’m Glad I’m a Boy! I’m Glad I’m a Girl!’

Creativity: The Psychology of Discovery and Invention is a revelatory read in its entirety, featuring insights on the ideal conditions for the creative process, the key characteristics of the innovative mindset, how aging influences creativity, and invaluable advice to the young from Csikszentmihalyi’s roster of 91 creative luminaries. Complement this particular excerpt with Ursula K. Le Guin on being a man — arguably the most brilliant meditation on gender ever written, by one of the most exuberantly creative minds of our time.

http://www.brainpickings.org/2014/11/07/psychological-androginy-creativity-csikszentmihalyi/
