US Millennials face higher unemployment, lower income than parents’ generation

By Shelley Connor
16 January 2017

A report released by Young Invincibles last week outlines key areas in which so-called Millennials—Americans between the ages of 18 and 34—face unprecedented financial difficulties.

The brief, which compares the financial health of Millennials to that of Baby Boomers in the 1980s, demonstrates that wages and home ownership have declined significantly within a generation. The authors measured five factors of Millennials’ financial health against that of young adults in the 1980s: income, assets, net wealth, home ownership and retirement planning.

The discrepancies in income alone are shocking: wages have declined by 20 percent since 1989, with Millennials earning about $10,000 less than Baby Boomers did as young adults. In 1989, a high school graduate earned about the same income as a college graduate earns today. The report also notes that an astounding 1 million young adults experienced long-term unemployment during the Great Recession of 2008-09.

The report’s authors maintain that, although income declined across all education levels for Millennials, a college degree remains a worthwhile investment. According to the Young Invincibles’ analysis, “intergenerational declines in income were steepest for those with no degree.” Nevertheless, years of deep cuts to state education budgets force today’s college students to contend with ever-rising tuition costs and increasing amounts of student loan debt.

The Young Invincibles’ report acknowledges that “student debt blunts some of education’s benefits,” which stands out as an impossibly sanguine understatement in light of the numbers presented. By the authors’ own analysis, median assets declined by 71 percent for college graduates with student debt, in contrast to a decline of 45 percent for college graduates without student loans.

Student debt is at an all-time high, with 42 percent of all 18-29 year olds reporting that they bear student loan debt. In addition, the average debt burden for students has nearly doubled within a single generation, with Millennials owing an average of $37,000 upon graduation.

In years past, a college degree was regarded as an important aspect of preparing for a career and gaining enough wealth to own a home and retire comfortably. The economic burdens of today’s college graduates, however, demonstrate that the economic downturn has cut deeply throughout all educational levels for working and lower-middle class youth.

When Baby Boomers graduated college, those with outstanding student loans earned an average of $68,000 annually. Student borrowers today, by contrast, can expect to earn an average of $51,000—a 25 percent decrease.

Another cornerstone of financial security for Americans, home ownership, has declined by about eight percent between the Baby Boom and the Millennial generations. When separating out those without college degrees, however, the decline is a much steeper 22 percent. College graduates with student loan debt are also less likely to own their homes.

Only half of today’s young adults own their own homes, according to Young Invincibles, and the authors point to studies that show an estimated 2.8 million 25- to 34-year-olds contend with severe rental burdens.

Housing accounts for over 60 percent of assets held by the middle class; it represents about 15 percent of gross domestic product. Given that fewer than half of today’s young adults can attain home ownership, while many others cannot afford to rent, the implications for the economy as a whole are sobering.

This is a particularly strong indicator of the depth of economic decline. Last year, a study by the Pew Research Center revealed that, for the first time since 1880, young adults between the ages of 18 and 34 were more likely to live with a parent than in any other living arrangement. Pew’s researchers pointed to an anemic job market, where 5.7 percent of men between ages 25 and 34 are unemployed. On top of this, rental costs have risen disproportionately to wages since 2008.

Housing is not the only area in which Millennials lag behind Baby Boomers. Young adults in the 1980s owned twice the assets of young adults in 2013. Research highlights the impact of student debt on this decline: non-borrowers in this cohort own over three times the assets of borrowers. In 1989, college graduates with student debt enjoyed a median net wealth of $86,500; by 2014, the same cohort had a median net wealth of only $6,600.

On its face, retirement seems to be the one area in which Millennials are on stronger footing than Baby Boomers; retirement plan ownership increased by 150 percent between 1989 and today. However, beneath the surface of this seemingly hopeful number lies the virtual disappearance of pension plans, which decreased from 27.1 million in 1989 to 15.2 million in 2013.

The Young Invincibles’ report was notably released amidst a storm of pageantry surrounding President Barack Obama’s exit from the White House. The New York Times, which pumps out wholesale lies and half-truths on a daily basis, published an editorial in its Sunday edition praising Obama’s “optimism.”

The Times heralded Obama’s ascension to the White House as a surprising victory over racism and greed and compared him to Abraham Lincoln, insinuating that he has made America a more equitable and prosperous country.

The Times also praised Obama’s stimulus plan, which they assert staved off another Great Depression, and hailed the federal investment in General Motors and Chrysler. This move, they claim, preserved more than a million jobs. They casually ignore the fact that the investment was predicated upon stripping autoworkers of their hard-earned pensions and dramatically cutting wages. Autoworkers today are forced to work grueling hours and face hazardous working conditions for poverty wages.

“Even now,” the Times chides, “…stubborn biases and beliefs… have blinded many Americans to their own good fortune, fortune that flowed from policies set in motion by this president.” The startling numbers cited by the Young Invincibles—declining home ownership, disappearing pensions, rents that outstrip earnings, and crippling student debt—give the lie to this offensive statement.

http://www.wsws.org/en/articles/2017/01/16/mill-j16.html

Bad times make great art?

Worlds of light and shadow: The reproduction of liberalism in Weimar Germany

The claim that good art comes from hard times is the height of delusionally entitled thinking

Fritz Lang’s “Metropolis” (1927) (Credit: Kino International)

On election night a murmur started just as the last gasp faded: “Well, at least we can expect some great art.” At first the sentiment was a fatalistic one-off, a brave face, a shy hope that something good would come from the dark days forecast for the Trump presidency. It didn’t take long for the statement to acquire a predictive tone; eventually a waft of desperation became detectable, and ultimately it hardened into shrill fiat.

The art of protest is provocative, no question. It’s often brave, usually fierce, sometimes compelling and occasionally inspirational. But is the appeal of the books, films, poetry, painting, television and sculpture produced in response to tyranny, oligarchic pomposity or a fetishistic prioritization of the bottom line universal, or simply reactive? How durable is the art birthed from protest? The following essay is the second in a series for Salon exploring the question: Do bad times really inspire great art?

On Nov. 6, 2016, just two days before the presidential election, aging American punks Green Day took the stage at the MTV Europe Music Awards to perform their Bush II-era pop-punk staple from 2004, “American Idiot.”

Singer Billie Joe Armstrong snarled in the vague direction of then-presidential hopeful, now president-elect Donald J. Trump, asking the audience of largely Dutch citizens possessing close to zero influence on the American political conversation, “Can you hear the sound of hysteria? The subliminal mind-Trump America.”

Apart from the lyrics not making a lot of sense, it also had no effect whatsoever on the outcome of the election. However well-meaning, Armstrong and Co. would have been just as effective by writing “DO NOT VOTE FOR DONALD TRUMP” on a piece of paper, cramming it in a bottle, and chucking it into the ocean, or by whispering “Trump is bad” into a hole.

The clear lesson: punk is dead. And not only that, but it’s been poisoned, drowned, hanged, beaten, stabbed, killed, re-killed and killed again, like some slobbering Rasputin-ish zombie. So when people claim, desperately, that Trump’s America will somehow lead to a resurgence in angry, politically charged guitar music, it’s all I can do to keep my eyes from rolling out of my head.

* * *

To claim that good art — that is: stuff of considerable aesthetic merit, which is maybe even socially advantageous — comes from hard times is the height of delusionally entitled thinking, as if mass deportations and radicalized violence are all in the service of a piece of music. Of course, even the idea of what qualifies as “good times” must be qualified. Given that Trump won the election, it stands to reason that for a majority of Americans (or at least for a majority of electoral college representatives) the prospect of a Trump presidency is a beneficial thing, which will usher in a new epoch of prosperity and big-league American greatness.

There may be truth, or at least the ring of truth, in the idea that objects of artistic value can be produced under the pressure of hardship. While it may be true that an artist like, say, the late Leonard Cohen was able to mine the fathomless quarries of heartache and longing for his music and poetry, it is also true that Cohen was blessed with socio-economic privilege, both in the form of family inheritances and grants from a liberal Canadian government that supported (and continues to support, in various respects) art and artists. His heart may have been hard, but the times weren’t.

At the cultural level, good art tends to emerge from good times. It’s not even about having a well-managed social welfare state (though that, of course, helps). Rather, it seems to be a matter of liberal attitudes reproducing themselves in certain contexts, leading to greater degrees of freedom and greater gains in artistic production and sophistication.

So forget Green Day for a second. Take, as an example, the Weimar Republic of Germany’s interwar period. It was a short-lived heyday of liberalism and representative democracy, flourishing smack between two periods of staunch authoritarianism: bookended by the post-unification German Empire on one side, and Nazi Germany on the other. It was in this context that some of the twentieth century’s most compelling art was created.

* * *

It’s tricky to even think about Weimar Germany without being ensnared by the sickly succour of cliché. You know: leggy chorus girls high-kicking in all-night cabarets, gays and lesbians fraternizing freely, women in short hair lighting cigarettes while the zippy strains of jaunty jazz waft hither and yon in a smoky hall — a populace caught in the full thrall of freedom. Fritz Lang’s 1922 film “Dr. Mabuse, The Gambler,” the opening titles of which describe it as “A Picture of the Times,” depicts Berlin’s underworld as at once rococo in its bourgeois elegance and chaotically debased. As the proprietor of an illegal casino puts it, summing up the free-spirited ethos of the era, “Everything that pleases is allowed.”

Emerging from the horror of the First World War and the 1918 November Revolution that saw the imperial government sacked, the nation’s consciousness was in a state of jumble and disarray. But it was an exciting jumble, full of possibility. The philosopher Ernst Bloch compared Weimar Germany to Periclean Athens of the fifth century BCE: a time of cultural thriving, sovereign self-governance, and increased social and political equality. Germany became a hub for intellectualism, nurturing physicists like Einstein and the critical theorists of the Frankfurt School. Art indulged experimentalism and the avant-garde, united less by common aesthetic tendencies and more by shared socialist values. It was the era of Otto Dix, Bertolt Brecht, the Bauhaus group, Arnold Schoenberg and a new, expressionist tendency in cinema.

Robert Wiene’s 1919 film “The Cabinet of Dr. Caligari” embodied the spirit of this new age. It told the story of a small community preyed upon by the maniacal carnival barker Dr. Caligari (Werner Krauss), whose newest attraction is a spooky-looking sleepwalker named Cesare (the great German actor Conrad Veidt). By cover of darkness, Caligari controls Cesare, using him to commit a string of violent crimes. With its highly stylized sets and its comments on the brutality of authority, the film presented a whole alternative vision of the world. Both stylistically and thematically, “Caligari” imagined the splintering of the postwar German psyche, presenting a sense that reality itself had been destabilized and was reconstituting itself in jagged lines and oblique curlicues. The movie’s lasting influence is inestimable.

In his landmark work of cultural analysis, “From Caligari to Hitler: A Psychological History of the German Film,” film critic Siegfried Kracauer described the “collapse of the old hierarchy of values and conventions” in Weimar-era Germany. “For a brief while,” Kracauer writes, “the German mind had a unique opportunity to overcome hereditary habits and reorganize itself completely. It enjoyed freedom of choice, and the air was full of doctrines trying to captivate it, to lure it into a regrouping of inner attitudes.”

Certainly, German cinema of the era often explicitly figures authoritarian characters attempting to seduce the public: from Weine’s madman Dr. Caligari, to Lang’s huckster Dr. Mabuse. For the reforming national consciousness, authority served as a kind of siren song, luring the public out of the rowdy cabarets and nightclubs and back on the straight and narrow. By the early 1930s, attitudes seemed to be shifting. In Fritz Lang’s classic thriller “M,” from 1931, police sniff out a serial killer in part by trying to determine a psychosexual basis for his crimes. It was at once a strike against the unfettered sexual libertinism of the Berlin cabarets, and a sinister intimation of Nazism, which was notoriously marked by its pseudoscientific quackery about the biological basis of criminality and depravity. The hallmarks of Weimar — its authoritarian disenthrallment, its slackening attitudes toward sexual repression, its intoxicating cosmopolitanism — were curdling.

* * *

Weimar poses a number of compelling questions around the subject of historical and cultural Golden Ages. How, after all, can a “Golden Age” be defined without presuming its emergence from, and collapse back into, periods of relative darkness and doom? Such rigidly compartmentalized, epochal thinking recalls Karl Marx’s account of historical stages, outlined in volume one of “Capital,” and the idea that each historical period carries within it the seeds of its successor. And it is force, according to Marx, that serves as “the midwife of every old society pregnant with a new one.”

In the case of Weimar, the sense of expanded liberty was undercut in several respects. While the upper and middle classes grew in prosperity, the working poor were afflicted by hyperinflation, and by and large unaffected by new gains made in left-wing modernist painting, cabaret culture and avant-garde cinema. Sexual libertinism bred syphilis outbreaks. Old-stock Germans balked at the moral and aesthetic degeneracy of the new art movements. For such people, Weimar was regarded less like Periclean Athens and more like the ancient African port of Carthage: fit to be sacked, razed, and have its earth salted so that no memory of it could possibly proliferate.

It speaks to a certain historical tendency. To revise Marx, it’s not just that a given society is pregnant with the next one, but that it’s pregnant with resentments and reactions. With Weimar, expanded cultural and political liberalism emerged as a reaction to the authoritarianism of imperial Germany, with the even fiercer authoritarianism and violence of Hitler’s regime emerging as a response to that. Stereotypes of left-leaning artists cavorting in cabarets found their negative image, their doppelgänger, in nationalist thugs roving the streets.

This is not to say that it wasn’t a period of growth and advancement, artistically and otherwise. Rather, it’s a historical reminder that even periods that usher in all manner of artistic and cultural headway need to be relentlessly qualified. It’s not that good times don’t make for good art. It’s that, really, there’s never been such a thing as a distinctly, determinedly, wholly unequivocally “good time.” Even the most shimmering epochs exist in contradiction, conflict and often out-and-out hypocrisy. Like the backdrop of “Caligari,” ours has always been a world of light and shadow. Something to keep in mind as the world stumbles into what’s shaping up to be a new Periclean Golden Age of American Idiocy.

John Semley lives and works in Toronto. He is a books columnist at the Globe & Mail newspaper and the author of “This Is A Book About The Kids In The Hall” (ECW Press).

Obama’s farewell address: One last round of clichés and lies

By Niles Niemuth
11 January 2017

President Barack Obama capped his eight years in office with a vacuous and hypocritical farewell address Tuesday night delivered at the McCormick Place convention center in downtown Chicago.

The first-ever presidential farewell address delivered outside of Washington, DC, had the atmospherics of an overblown, cheap spectacle. Obama strode onto the stage like a rock star, flanked by oversized American flags, a massive illuminated presidential seal and an introductory soundtrack by the rock band U2.

As with every address Obama has delivered over the last eight years, his speech in Chicago was full of clichés, his rhetoric padded with empty phrases and delivered with a false gravitas, signaled by his trademark pursed lips and affected whisper.

The speech was rife with contradictions, the starkest being the juxtaposition of Obama’s boasting of the great social progress achieved by his administration and his warning of threats to American democracy arising from ever-growing social inequality and economic insecurity.

The president declared: “If I had told you eight years ago that America would reverse a great recession, reboot our auto industry, and unleash the longest stretch of job creation in our history… if I had told you that we would open up a new chapter with the Cuban people, shut down Iran’s nuclear weapons program without firing a shot, and take out the mastermind of 9/11… if I had told you that we would win marriage equality, and secure the right to health insurance for another 20 million of our fellow citizens—you might have said our sights were set a little too high.

“By almost every measure, America is a better, stronger place than it was when we started.”

He made no attempt to explain why, given this impressive record of social progress and foreign policy success, his party was routed in the elections and the billionaire demagogue Donald Trump was preparing to succeed him in the White House.

A basic component of the answer, of course, is the grotesquely false rendering of his record and the state of American society as he leaves office. Hardly a week goes by without a new report on signs of extreme social crisis or ever-more obscene levels of wealth among the financial elite. Just in the past month, studies have been published showing the first decline in US life expectancy in 23 years, plunging pay for young adults, a 72 percent surge in deaths from synthetic opioids, and home ownership rates at historic lows for young people.

Other surveys have documented a $237 billion increase in the wealth of the world’s richest 200 billionaires, driven largely by the US stock market boom under Obama, and an acceleration of the transfer of wealth from the bottom half of the US population to the top one percent.

In boasting of presiding over a record number of consecutive monthly job increases, Obama neglected to mention that 94 percent of the new jobs created in the last eight years have been either part-time or temporary.

Noticeably absent from Obama’s remarks was any mention of the social conditions in the city where he was speaking, which is ravaged by high levels of poverty and unemployment, an epidemic of police killings and violence, and a skyrocketing homicide rate.

He lamented in general terms the growth of social inequality and the dangers it poses to American democracy—that is, the threat of a social explosion in the United States.

“While the top one percent has amassed a bigger share of wealth and income, too many families, in inner cities and rural counties, have been left behind—the laid-off factory worker; the waitress and health care worker who struggle to pay the bills—convinced that the game is fixed against them, that their government only serves the interests of the powerful—a recipe for more cynicism and polarization in our politics.”

As always, he spoke as if none of these social ills had anything to do with the policies pursued by his administration, including severe cuts in social spending on the one side and the bailout of the banks and flooding of money into the stock market on the other.

Another piece of monumental hypocrisy was Obama’s pose of fighting to defend democracy when he has done more to destroy it than perhaps any other US president.

“Democracy can buckle when we give in to fear,” he declared. “So just as we, as citizens, must remain vigilant against external aggression, we must guard against a weakening of the values that make us who we are. That’s why, for the past eight years, I’ve worked to put the fight against terrorism on a firm legal footing. That’s why we’ve ended torture, worked to close Gitmo, and reform our laws governing surveillance to protect privacy and civil liberties.”

This is from a president who has personally authorized the assassination of American citizens and thousands of others around the world with drone-fired missiles, protected and promoted those in the CIA responsible for torture, kept the prison at Guantanamo Bay open, persecuted journalists and jailed whistleblowers, militarized the police, and expanded the illegal surveillance of electronic communications.

Obama also used his farewell address to take parting shots at Russia and China, lumping the war against ISIS together with efforts to counter both countries, and arguing that aggressive action against the world’s second- and third-largest nuclear-armed powers was the only way to avoid war.

“[T]he fight against extremism and intolerance and sectarianism are of a piece with the fight against authoritarianism and nationalist aggression,” he said. “If the scope of freedom and respect for the rule of law shrinks around the world, the likelihood of war within and between nations increases, and our own freedoms will eventually be threatened.”

Obama spent his eight years in office waging war abroad and war on the working class at home. With Tuesday’s speech, he passed the reins to Trump with a shrug.

WSWS

47% of Jobs Will Disappear in the next 25 Years

The Trump campaign ran on bringing jobs back to American shores, although mechanization has been the biggest reason for manufacturing jobs’ disappearance. Similar losses have led to populist movements in several other countries. But instead of a pro-job growth future, economists across the board predict further losses as AI, robotics, and other technologies continue to be ushered in. What is up for debate is how quickly this is likely to occur.

Now, an expert at the Wharton School of the University of Pennsylvania is ringing the alarm bells. According to Art Bilger, venture capitalist and board member at the business school, all the developed nations on earth will see job loss rates of up to 47% within the next 25 years, a figure drawn from a recent Oxford study. “No government is prepared,” The Economist reports. These losses will include both blue-collar and white-collar jobs. So far, the loss has been restricted to the blue-collar variety, particularly in manufacturing.

To combat “structural unemployment” and the terrible blow it is bound to deal the American people, Bilger has formed a nonprofit called Working Nation, whose mission it is to warn the public and to help make plans to safeguard them from this worrisome trend. Not only is the entire concept of employment about to change in a dramatic fashion, the trend is irreversible. The venture capitalist called on corporations, academia, government, and nonprofits to cooperate in modernizing our workforce.

To be clear, mechanization has always cost us jobs. The mechanical loom, for instance, put weavers out of business. But it has also created jobs. Mechanics had to keep the machines going, machinists had to make parts for them, workers had to attend to them, and so on. Often, those in one profession could pivot to another. At the beginning of the 20th century, for instance, automobiles were putting blacksmiths out of business. Who needed horseshoes anymore? But they soon became mechanics. And who was better suited?

A Toyota plant, Japan. Manufacturing is almost fully automated today and so many other jobs are not far behind.

Not so with this new trend. Unemployment today is significant in most developed nations, and it’s only going to get worse. By 2034, just a couple of decades from now, mid-level jobs will be by and large obsolete. So far the benefits have gone only to the ultra-wealthy, the top 1%. This coming technological revolution is set to wipe out what looks to be the entire middle class. Not only will computers be able to perform tasks more cheaply than people, they’ll be more efficient too.

Accountants, doctors, lawyers, teachers, bureaucrats, and financial analysts beware: your jobs are not safe. According to The Economist, computers will be able to analyze and compare reams of data to make financial decisions or medical ones. There will be less of a chance of fraud or misdiagnosis, and the process will be more efficient. Not only are these folks in trouble, such a trend is likely to freeze salaries for those who remain employed, while income gaps only increase in size. You can imagine what this will do to politics and social stability.

Mechanization and computerization cannot cease. You can’t put the genie back in the bottle, and eventually everyone must adopt these technologies. The mindset is this: other countries would use such technology to gain a competitive advantage, and therefore we must adopt it. Eventually, new tech startups and other businesses might absorb those who have been displaced. But the pace is sure to move far too slowly to avoid a major catastrophe.

According to Bilger, the problem has been building for a long time. Take into account the longevity we enjoy nowadays and the US’s broken education system, and the problem is compounded. One proposed solution is a universal basic income doled out by the government, a baseline payment one would receive for survival. After that, re-education programs could help people find new pursuits. Others would want to start businesses or take part in creative enterprises. It could even be a time of the flowering of humanity, when instead of chasing the almighty dollar, people would be able to pursue their true passions.

On a recent radio program, Bilger talked about retooling the education system in its entirety, including adding classes that are sure to transfer into the skills workers need for the jobs that will be there. He also discussed the need to retrain middle-aged workers so that they can participate in the economy, rather than be left behind. Bilger said that “projects are being developed for that.” Though he admits that many middle-aged workers are resistant to reentering the classroom, Bilger says it’s necessary. What’s more, they are looking at ways of making the classroom experience more dynamic, such as using augmented reality for retraining purposes, as well as to reinvent K-12 education. But such plans are in the seminal stages.

Widespread internships and apprenticeships are also on the agenda. Today, the problem, as some contend, is not that there aren’t enough jobs, but that there aren’t enough skilled workers to fill the positions that are available. Bilger seems to think that this problem will only grow more substantial.

But would those who drive for a living, say long haul truckers and cab drivers, really find a place in the new economy with retraining, once self-driving vehicles become pervasive? No one really knows. Like any major shift in society, there are likely to be winners and losers. This pivot point contains the seeds for a pragmatic utopia, or complete social upheaval, but is likely to fall somewhere between.

Bilger ended the interview saying, “What would our society be like with 25%, 30% or 35% unemployment? … I don’t know how you afford that, but even if you could afford it, there’s still the question of, what do people do with themselves? Having a purpose in life is, I think, an important piece of the stability of a society.”

 

 

The Breakdown of Empathy and the Political Right in America

What child psychology can teach us about the current GOP.

In 1978, developmental psychologist Edward Tronick and his colleagues published a paper in the Journal of the American Academy of Child Psychiatry that demonstrated the psychological importance of the earliest interactions between mothers and babies. The interactions of interest involved the playful, animated and reciprocal mirroring of each other’s facial expressions. Tronick’s experimental design was simple: A mother was asked to play naturally with her 6-month-old infant. The mother was then instructed to suddenly make her facial expression flat and neutral and to remain completely still for three minutes, regardless of her baby’s activity. Mothers were then told to resume normal play. The design came to be called the “still face paradigm.”

When mothers stopped their facial responses to their babies, when their faces were still, babies first anxiously strove to reconnect with their mothers. When the mothers’ faces remained neutral and still, the babies quickly showed ever-greater signs of confusion and distress, followed by a turning away from the mother, finally appearing sad and hopeless. When the mothers in the experiment were permitted to re-engage normally, their babies, after some initial protest, regained their positive affective tone and resumed their relational and imitative playfulness.

When a primary caretaker (the still-face experiments were primarily done with mothers, not fathers) fails to mirror a child’s attempts to connect and imitate, the child becomes confused and distressed, protests, and then gives up. Neurobiological research (summarized by child psychiatrist Bruce Perry and science writer Maia Szalavitz in their book, Born for Love: Why Empathy Is Essential—and Endangered), has powerfully demonstrated that in humans and other mammals, a caretaker’s attunement and engagement is necessary to foster security, self-regulation and empathy in the developing child. Parental empathy stimulates the growth of empathy in children. The infant brain is a social one and is ready to respond to an environment that is appropriately nurturing.

On the other hand, when the environment is inattentive and not empathetic, the child’s stress response system, embedded as it is in the architecture of the child’s developing nervous system (mediators in this system include oxytocin, opiate and dopamine receptors, cortisol levels and parasympathetic nerve pathways), is overwhelmed and many types of psychopathology result. Higher cognitive functions, including language, can suffer as the brain instinctively relies on more primitive regions to deal with an unresponsive environment.

The worst scenarios are those occurring in conditions over which children have no control, such as the distress faced by the babies in the still-face experiments. When we are powerless to prevent our nervous systems and psyches from being overwhelmed, our physical, emotional and intellectual development is disrupted. We call this trauma.

As a metaphor for adult life in contemporary society, the still-face paradigm—the helplessness intrinsic to it and the breakdown of empathy that lies at its foundation—aptly describes the experience of many people as they interact with the most important institutions in their lives, including government. And as with Tronick’s babies and mothers, when our social milieu is indifferent to our needs and inattentive to our suffering, widespread damage is done to our psyches, causing distress, anger and hopelessness. Such inattention and neglect lead to anxiety about our status and value, and a breakdown of trust in others.

The pain of the still face in American society is present all around us. People feel it while waiting for hours on the phone for technical support, or when dealing with endless voicemail menus while on hold with the phone or cable company, or waiting to get through to their own personal physician. They feel it in schools with large class sizes and rote teaching aimed only at helping students pass tests. They feel it when crumbling infrastructure makes commuting to work an endless claustrophobic nightmare. And too often, they feel it when interacting with government agencies that hold sway over important areas of their lives, such as social services, the IRS, building permit and city planning departments, or the motor vehicle department. Like Tronick’s babies, citizens who look to corporations and government for help, for a feeling of being recognized and important, are often on a fool’s errand, seeking recognition and a reciprocity that are largely absent.

This problem is greatly exacerbated by the profoundly corrosive effects of social and economic inequality. Under conditions of inequality, the vulnerability of those seeking empathy is dramatically ramped up, leading to various forms of physical and psychological breakdown. In a classic epidemiological study, Richard Wilkinson and Kate Pickett found a strong correlation between the degree of inequality in a country (or a state, for that matter) and such problems as imprisonment rates, violence, teenage pregnancy, obesity, mental health problems such as anxiety, depression and addiction, lower literacy scores, and a wide range of poor health outcomes, including reduced life expectancy. Wilkinson and Pickett’s key finding is that it is the inequality itself, and not the overall wealth of a society, that creates these various pathologies. Poorer places with more equality do better than wealthy ones marked by gross inequality.

Inequality makes people feel insecure, preoccupied with their relative status and standing, and vulnerable to the judgment of others. It also creates a greater degree of social distance between people, depriving them of opportunities for intimate and healing experiences of recognition and empathy.

But as the still-face experiments show, human beings are primed from birth to be social, to seek out empathic and attuned responses from others, and to develop the psychobiological equipment to respond in kind. Still-face bureaucracies and the powerlessness that marks systems of income inequality contradict our very natures. As Wilkinson and Pickett put it, “For a species which thrives on friendship and enjoys cooperation and trust, which has a strong sense of fairness, which is equipped with mirror neurons allowing us to learn our way of life through a process of identification, it is clear that social structures based on inequality, inferiority and social exclusion must inflict a great deal of social pain.”

This pain is increasingly prevalent among working- and middle-class Americans who have seen their jobs lost to technology and globalization, their incomes stagnate and the promise of a better life for their children appear increasingly unlikely. Their interactions with their doctors, pharmacists, bankers, landlords, state and federal tax collectors, social service agencies, auto dealers, and cable providers are too often marked by frustration and feelings of dehumanization. Like Tronick’s infants, they can’t get anyone even to see them, much less smile at them.

To make matters worse, they also live in a meritocratic culture that blames the victim, even while these victims have little power to escape their lot. The old adage that it’s lonely at the top and that Type-A executives have more than their share of stress is false. Studies on stress show that what is most stressful isn’t being in charge but being held accountable for outcomes over which you have little or no control.

The painful interaction of inequality and indifference is especially poignant and strongly felt by groups in our society who bear the brunt of discrimination. People of color, immigrants and the LGBT community are especially traumatized by the still face of social and political invisibility, of the demeaning effects of prejudice and institutional bias. They are in the most dire need of empathy, yet they are the least likely to get it.

As studies of infants and the development of children have shown, empathy is essential to building our capacity to deal with pain and adversity and to developing into social, empathic beings. Without empathy, we become overwhelmed and either go about our lives in a fight-or-flight state of hyper-vigilance or retreat and surrender to feelings of hopelessness and despair. Thus, while empathy depends on being accurately and frequently understood in social interactions, our society is increasingly one in which people can’t find responsive faces or attuned, reliable relationships anywhere.

This absence isn’t simply an individual matter. Household size has shrunk. The average number of confidants people have has declined sharply over the last few decades, from three in 1985 to two in 2004, with a full quarter of Americans reporting that they have no confidants at all. Time spent socializing with friends or having family dinners has similarly declined. The last five decades have witnessed stunning declines in virtually every form of social and civic participation, spaces where people can encounter each other, face to face, in their communities, including churches, social clubs, the PTA, and even, according to sociologist Robert Putnam, bowling leagues.

The number of hours that children spend playing outside in unstructured activities—critical for the development of social skills and empathy—fell by 50 percent between 1981 and 1997, a loss replaced by radical increases in time spent watching television or sitting in front of computer screens. On average, American kids watch two to four hours of television daily. And consider this: 43 percent of children under two watch television or videos every day. Children need face-to-face human interaction. Digital substitutes just won’t do.

On nearly all measures of social life, Americans tend to have fewer and lower quality interactions with one another than their parents or grandparents did. Isolation has grown along with inequality. They go together. Societies with more economic fairness and equality are ones that encourage and privilege cooperation and mutuality. Societies like ours that are so exceptionally unequal encourage and privilege aggression, paranoia and competitiveness, traits associated with the so-called rugged individualist. While sometimes adaptive, such an ideal also makes a virtue out of disconnection and trauma.

The links between the failures of empathy in childhood and similar experiences in adult social and political life are not simple or straightforward. We cannot reduce white working-class anger, for example, to childhood traumas, and it is certainly true that the feelings of neglect and rejection associated with encountering the still face of social institutions are ubiquitous, not restricted to the economically disadvantaged. As noted above, people of color, the majority of the working class, endure this neglect and rejection in especially harsh ways. Race matters, but so does wealth. It remains true that wealth and income can enhance feelings of agency and control and can “buy” greater responsiveness from those we need help or support from.

To get a deeper understanding of the intersection of politics and the psychobiology of empathy and trauma, we need a deeper and more nuanced account of the interior lives of the working and middle-class people who have been left behind in our society. Berkeley sociologist Arlie Hochschild offers such an account in her recent book, Strangers in Their Own Land: Anger and Mourning on the American Right. Based on her years embedded with Tea Party sympathizers and activists in southwest Louisiana, she describes what she calls the “deep story” of the white working-class people she got to know. For Hochschild, a deep story is a person’s subjective emotional experience, free of judgment and facts. It is the subjective prism through which all people—in this case, Tea Party voters—see the world.

Hochschild presents their story in a metaphorical way that represents the hopes, fears, shame, pride and resentment in the lives of her informants. It’s a story of people for whom there is no fairness; people who see the still face of government smiling on others but not on them. In fact, Hochschild’s subjects perceive the faces of many people in American society (for example, liberals living on the coasts) looking at them with disdain or contempt, not smiling in recognition or understanding.

The following is an edited version of this deep story:

You are patiently waiting in a long line leading up a hill… you are situated in the middle of this line, along with others who are also white, older, Christian, and predominantly male, some with college degrees, some not.

Just over the brow of the hill is the American Dream, the goal of everyone waiting in line. Many in the back of the line are people of color—poor, young and old, mainly without college degrees. It’s scary to look back; there are so many behind you. In principle you wish them well. Still, you’ve waited a long time, worked hard and the line is barely moving. You deserve to move forward a little faster.

You’ve suffered long hours, layoffs and exposure to dangerous chemicals at work, and received reduced pensions. You have shown moral character through trial by fire, and the American dream of prosperity and security is a reward for all of this, showing who you have been and are—a badge of honor. Will I get a raise? Are there good jobs for us all? 

The line is moving backward! You haven’t gotten a raise in years and your income has dropped. You’re not a complainer. But this line isn’t moving.

Look! You see people cutting in line ahead of you! You’re following the rules. They aren’t. Some are black… affirmative action. Women, immigrants, refugees, public sector workers—where will it end? If you are a man, [there are] women demanding the right to the men’s jobs… and overpaid public sector employees… who seem to you to work shorter hours in more secure and overpaid jobs, enjoying larger pensions than yours… Four million Syrian refugees fleeing war and chaos… even the brown pelican which is protected as an endangered species… even they have cut in line. You feel betrayed.

In this story, the economy and government are indifferent to the people in the middle of the line. Their sacrifice is ignored. And other people seem to be getting the smiles that should shine on them. It’s as if the mother in the still-face paradigm not only didn’t respond to her child’s attempt to engage, but instead looked the other way and smiled at someone else. Their resentments are stereotyped as intrinsically racist or misogynist, while their own claim to victimhood is discounted.

While this story is not solely about race, it clearly taps into racist sentiments. It is important to be clear about the difference between the subjective experience of white working-class men and the reality. Poor and middle-class whites have been sensitized to the sound of racist dog-whistles for generations. The right-wing media machine, one that reached its zenith in the Trump campaign, has stoked the fires of the scapegoating reflex that always seems to lie just beneath the surface of the psyches of victimized whites. It’s important to pause and recognize that the propagandistic xenophobia of the right has helped propagate the deep story Hochschild so empathetically tells. No one, in fact, is actually “cutting in line”—not people of color, immigrants or LGBT people. While it is still important to understand the subjective experience of her subjects in the deepest possible way, we must also recognize the play of hidden ideologies.

The failure of our institutions to empathize with the plight of the middle and working classes, to recognize their sacrifice and reward their hard work is traumatic. It is the same type of trauma that children experience when their caretakers are preoccupied or rejecting. The trauma erodes trust. It overwhelms systems that people have developed to deal with stress and creates psychological suffering and illness.

Adults, like children, try to cope with the stress of failures of recognition in the best ways they can. They certainly get anxious and depressed and may turn to drugs and alcohol to manage these painful feelings. In addition, when social trust is weakened and people are isolated, they try to find ways to belong, to be part of a community. The Tea Party is one such community. Others turn to their church communities. Their social brains seek an experience of “we” and often do so by creating a fantasy of a “them” whom they can devalue and fight. Tribalism draws on our need for relatedness, but tragically, can also pervert it. Rejected by employers and government, they reject and demean others. All the while, they are trying to deal with the pain, powerlessness and lack of empathy they experience in their social lives.

Donald Trump clearly spoke to this pain. He empathized with the traumatic losses and helplessness of the white middle and working classes. He helped them feel part of something bigger than themselves, a movement that combatted their isolation. And he helped restore their feeling of belonging by positioning them against demeaned others, primarily immigrants and countries on the other end of “horrible trade deals.”

The research on the development of empathy and the trauma resulting from its absence, on the links between economic inequality and physical and psychological suffering, and on the corrosive effects of social isolation has to lead progressives to renew their campaign for radical reforms of our economy and politics. The work of Edward Tronick and others should lead us to support families in every way possible, so that parents have the time and resources to empathically connect with their children. Wilkinson and Pickett’s research on the harmful effects of economic inequality should force us to make redistribution the centerpiece of our political program, just as it was for Bernie Sanders. Their research clearly shows that greater equality can ameliorate a wide range of suffering.

The fact that our society disconnects us from each other means that we have to seek common ground with the people on the other side of what Hochschild calls the “empathy wall.” We have to communicate to them that we not only feel their pain, but that we share it, and that in the end, we are all in this together.

Michael Bader is a psychologist and psychoanalyst in San Francisco. He is the author of “More Than Bread and Butter: A Psychologist Speaks to Progressives About What People Really Need in Order to Win and Change the World” (Blurb, 2015).


Alternet

Trump loves to talk about his intellectual gifts and academic success. Shockingly, none of it is true.

Donald Trump’s questionable intelligence: All those false claims about his academic record and derision of others bespeak profound insecurity


Donald Trump says he doesn’t need daily intelligence briefings.

“I’m, like, a smart person,” he told Fox News’ Chris Wallace last weekend, explaining why he’ll be the first president since Harry Truman to avoid getting daily updates from intelligence professionals about national security threats.

During and since his campaign, Trump has sounded these two themes. First, he’s skeptical of intelligence. Second, he’s smart.

The first is obviously true. The second is a matter of dispute.

Trump frequently boasts about how smart he is. Anyone who feels compelled to tell everyone how smart he is reveals insecurity about his intelligence and accomplishments. In Trump’s case, he has good reason to have doubts.

Trump has the kind of street smarts (what he calls “gut instinct”) characteristic of con artists, but his limited vocabulary, short attention span, ignorance of policy specifics, indifference to scientific evidence and admitted aversion to reading raise questions about his intellectual abilities.

In 2004, in an interview with CNN, Trump said, “I went to the Wharton School of Finance. I got very good marks. I was a good student. It’s the best business school in the world, as far as I’m concerned.”

Trump has repeated that claim many times since. Each time, it isn’t clear if he’s trying to convince his interviewer or himself.

In 2011, in an interview with ABC, Trump said: “Let me tell you, I’m a really smart guy. I was a really good student at the best school in the country,” referring once again to Wharton, the University of Pennsylvania’s business school, where he earned a bachelor’s degree in 1968.

“I went to the Wharton School of Finance,” he said during a speech in Phoenix in July of 2015. “I’m, like, a really smart person.”

In an interview on NBC’s “Meet the Press” in August 2015, Trump described Wharton as “probably the hardest there is to get into.” He added, “Some of the great business minds in the world have gone to Wharton.” He also observed: “Look, if I were a liberal Democrat, people would say I’m the super genius of all time. The super genius of all time.”

During a CNN-sponsored Republican town hall in Columbia, South Carolina, last February, Trump reminded the audience that he had gone to Wharton and then repeated the same boast: “Look, I went to the best school, I was a good student and all of this stuff. I mean, I’m a smart person.”

Trump frequently communicates via Twitter, which is not a good venue for displaying one’s linguistic prowess, but many observers have noted that Trump has a difficult time expressing himself and speaking in complete sentences. A linguistic analysis last year by Politico found that Trump speaks at a fourth-grade level. A study by researchers at Carnegie Mellon University compared this year’s Republican and Democratic presidential candidates in terms of their vocabulary and grammar. Trump scored at a fifth-grade level, the lowest of all the candidates. Some might suspect that this is not an intellectual shortcoming but instead Trump’s calculated way of communicating with a wide audience. But Tony Schwartz, who spent a great deal of time with the developer while ghostwriting his book “The Art of the Deal,” noted that Trump has a very limited vocabulary. It would hardly be surprising if these observations infuriated the vain and insecure Trump.

Trump’s persistent insults directed toward anyone who disagrees with him also suggest deep insecurity. Before, during and since his presidential campaign, Trump has constantly denigrated his opponents and detractors — among them actresses Rosie O’Donnell and Cher, businessman Mark Cuban, GOP political operatives Karl Rove and Ana Navarro, NBC’s Chuck Todd, Jeb Bush, Weekly Standard editor Bill Kristol and conservative columnist George Will — as “losers.” It turns out that this is one of Trump’s favorite words. An archive of Trump’s Twitter account since 2009 found that he used the word “loser” 235 times. His other favorite insults include “dumb” or “dummy” (222 tweets), “terrible” (202), “stupid” (182), “weak” (154) and “dope” (115).

For example, on May 8, 2013, at 6:37 p.m., Trump tweeted: “Sorry losers and haters, but my I.Q. is one of the highest -and you all know it! Please don’t feel so stupid or insecure, it’s not your fault.”

At 3:52 pm on Sept. 26, 2014 – nine months before he announced his candidacy for the White House – Trump tweeted: “I wonder if I run for PRESIDENT, will the haters and losers vote for me knowing that I will MAKE AMERICA GREAT AGAIN? I say they will!”

In contrast to his attacks on “losers,” Trump frequently retweets comments from others congratulating him for how “smart” he is.

Only someone who doubts his own intelligence would feel compelled to make these kinds of public statements.

Trump surely knows that he didn’t get into Wharton on his own merits. He transferred into Wharton’s undergraduate program after spending two years at Fordham University in New York. According to Gwenda Blair’s 2001 biography, “The Trumps,” Trump’s grades at Fordham were not good enough to qualify him for a transfer to Wharton. Blair wrote that Trump got into Wharton as a special favor from a “friendly” admissions officer who knew Trump’s older brother, Freddy. The college’s admissions staff surely knew that Trump’s father was a wealthy real estate developer and a potential donor.

Moreover, Trump has exaggerated his academic accomplishments at Wharton. On at least two occasions in the 1970s — “A Builder Looks Back — and Moves Forward,” on Jan. 28, 1973, and “Donald Trump, Real Estate Promoter, Builds Image as He Buys Buildings,” on Nov. 1, 1976 — the New York Times reported that Trump “graduated first in his class” at Wharton in 1968.

That’s not true. The dean’s list for his graduation year, published in the Daily Pennsylvanian, the campus newspaper, doesn’t include Trump’s name. He has refused to release his grade transcripts from his college days.

It is likely that Trump was the original source for that falsehood, but it isn’t entirely clear, since neither Times article attributes it directly to him. But the fabrication that Trump was first in his class has been repeated in many other articles as well as books about Trump, so he clearly knew that it was out there in the public domain and has never bothered to correct it.

“He was not in any kind of leadership. I certainly doubt he was the smartest guy in the class,” Steve Perelman, a classmate of Trump’s at Wharton, told the Daily Pennsylvanian last year.

Trump’s insecurity about his accomplishments is also revealed in his efforts to portray himself as an up-by-the-bootstraps self-made entrepreneur.

“It has not been easy for me,” Trump said at a town hall meeting on Oct. 26, 2015, acknowledging, “My father gave me a small loan of a million dollars.”

At a news conference early this year, Trump repeated the same story: “I got a very, very small loan from my father many years ago. I built that into a massive empire and I paid my father back that loan.”

An investigation by the Washington Post in March demolished Trump’s claim that he made it on his own. Not only did Trump’s multi-millionaire father Fred provide Donald with a huge inheritance and set up big-bucks trust accounts to provide his son with a steady income, but Fred was also a silent partner in Trump’s first real estate projects:

Trump’s father — whose name had been besmirched in New York real estate circles after investigations into windfall profits and other abuses in his real estate projects — was an essential silent partner in Trump’s initiative. In effect, the son was the front man, relying on his father’s connections and wealth, while his father stood silently in the background to avoid drawing attention to himself.

Fred Trump’s real estate fortune was hardly due to his faith in the free market, but instead stemmed from his reliance on government subsidies. He made his money building middle-class apartments financed by the Federal Housing Administration (FHA). In 1954, when Donald was 8 years old, his father was subpoenaed to testify before the Senate Banking Committee on allegations that he had ripped off the government to reap windfall profits through his FHA-insured housing developments. At the hearings, the elder Trump was called on the carpet for profiteering off public contracts, including overestimating the construction costs of his projects in order to get larger mortgages from FHA. Under oath, he reluctantly admitted that he had wildly overstated the development costs.

Donald has followed in his father’s corrupt footsteps. Trump’s career is littered with bogus businesses (like Trump University); repeated ripoffs of suppliers, contractors and employees whom he failed to pay for services rendered; and the misuse of the Trump Foundation to feather his own nest while trying to look like a philanthropist. Six of Trump’s businesses have gone bankrupt. Despite this, on April 18, 2015, Trump tweeted this falsehood: “For all of the haters and losers out there sorry, I never went Bankrupt.”

Trump has also lied about the size of his wealth, as various business publications have pointed out. Many observers suggest that one reason Trump has refused to release his tax returns is that they will show that he has repeatedly and wildly exaggerated his wealth and thus his success.

Given this background — his lackluster academic record, his dependence on his family’s connections and wealth to get into college and to succeed in business, and his troublesome and abusive business practices — it shouldn’t be surprising that Trump is so insecure about his intellect and so thin-skinned about his accomplishments.

Peter Dreier is professor of politics and chair of the Urban & Environmental Policy Department at Occidental College. His most recent book is “The 100 Greatest Americans of the 20th Century: A Social Justice Hall of Fame” (Nation Books).

A few good Monuments Men: The British Army is recruiting experts who fancy themselves George Clooney 2.0 to preserve global cultural treasure

Saving art from looting and destruction — especially in the Middle East — is a military matter

The British Army recently announced that it would be recruiting 15 to 20 new officers with specializations in art, archaeology and antiquities who will be deployed in the field, just behind the front lines, to help identify, protect and track art and antiquities that are in danger of being damaged, looted or destroyed.

This is, of course, particularly relevant to Middle Eastern conflicts, where groups like ISIS have shown a giddy eagerness to destroy ancient monuments, on the scale of whole cities like Palmyra and Nimrud, as well as individual pre-Islamic statuary deemed heretical. The flip side is that these groups are also happy to profit off the very objects they condemn, funding their activities through illicit trade in antiquities.

These works of great artistry and historical importance — which cannot exactly be blamed for not being Islamic artworks, if they were created a millennium before the Prophet Muhammad was born — can be saved from the mallet if they are destined for the auctioneer’s hammer.

Not long ago, scholars like me (I’m a specialist in the history of art crime) had to work hard to convince not just the general public but also authorities, police and politicians that art crime, particularly illicit trade in antiquities, funded organized crime and terrorism. No longer. This stance has been vindicated, unfortunately, in scores of destructive ways, most obviously through ISIS videos of iconoclasm. The only remaining questions concern the scale of the earnings from looted antiquities and what to do to stem the flow.

The most direct way to curb the looting is to discourage First World buyers from purchasing anything that cannot be clearly shown to be free of links to recent excavations in Middle Eastern conflict zones. But while those in the art trade talk a good game, there is profit to be had, from major galleries to online auction sites (where it is easy for a seller to hide his or her identity, difficult to be sure of an object’s provenance, and where some objects have been advertised as still covered in desert sand, as if this were a selling point). The documentary “Blood Antiques” chillingly shows how certain Brussels art dealers, for instance, collude with actors posing as sellers of looted antiquities, some still bearing desert sand.

So the need is clear to help protect surviving monuments from iconoclasts and do what we can to limit the funds for fundamentalist groups. Curbing art crime is one way to do that. The U.K. and France are among the governments that have recently dedicated tens of millions to protecting cultural heritage. The National Endowment for the Humanities in the United States recently launched a new grant for projects with that same goal. So it stands to reason that the military would reinstate officers who might be described as modern-day Monuments Men.

***

It is fitting that the Monuments Men 2.0 should be spearheaded by the British, because they were the masterminds behind the original incarnation. The highest-profile among the officers were Americans, promoted by recent books like “Monuments Men,” which focused on George Stout (whose fictional avatar was played by Clooney in his film of the same name), and my own “Stealing the Mystic Lamb,” which focused on Monuments Men Robert Posey and Lincoln Kirstein. (Fictionalized versions of the latter were played by Bill Murray and Bob Balaban in the “Monuments Men” film.)

But the core of the program was British, led by Sir Leonard Woolley, a rumpled, opinionated archaeologist, a buddy of Agatha Christie and someone in desperate need of a rollicking biography, preferably penned by the master of World War II intrigue romps, Ben MacIntyre. Woolley’s project has its origins in January 1943, when archaeologists Mortimer Wheeler and John Ward-Perkins visited archaeological sites like Leptis Magna in North Africa.

A team of Italian archaeologists had recently excavated the site, but the excavated objects were just left there unprotected. British soldiers were inadvertently damaging ruins without realizing that the stones were of cultural or historical importance. The two prepared homemade “out of bounds” signs to show people where to avoid treading.

Thus began a movement to educate soldiers in the field, for with knowledge of the treasures they might encounter came consideration to protect them. What Wheeler conceived was later enacted by Woolley, as part of the British War Office. But Woolley envisioned this as an advisory role, not one that would actually send officers into war zones. That would be the American contribution.

Meanwhile, in the United States at the outset of World War II, a group of American museum leaders, including Paul Sachs of Harvard’s Fogg Museum and Alfred Barr of the Museum of Modern Art, drew up a list of cultural heritage objects that might be in danger in the course of the fighting on continental Europe. This list was linked to maps that were distributed to officers, along with General Dwight Eisenhower’s important directive to avoid damaging cultural sites whenever possible. There is no record of a previous general making such a statement.

Noah Charney is a Salon arts columnist and professor specializing in art crime, and author of “The Art of Forgery” (Phaidon).