How Diversity Makes Us Smarter

Being around people who are different from us makes us more creative, more diligent and harder-working


IN BRIEF

  • Decades of research by organizational scientists, psychologists, sociologists, economists and demographers show that socially diverse groups (that is, those with a diversity of race, ethnicity, gender and sexual orientation) are more innovative than homogeneous groups.
  • It seems obvious that a group of people with diverse individual expertise would be better than a homogeneous group at solving complex, nonroutine problems. It is less obvious that social diversity should work in the same way—yet the science shows that it does.
  • This is not only because people with different backgrounds bring new information. Simply interacting with individuals who are different forces group members to prepare better, to anticipate alternative viewpoints and to expect that reaching consensus will take effort.

The first thing to acknowledge about diversity is that it can be difficult. In the U.S., where the dialogue of inclusion is relatively advanced, even the mention of the word “diversity” can lead to anxiety and conflict. Supreme Court justices disagree on the virtues of diversity and the means for achieving it. Corporations spend billions of dollars to attract and manage diversity both internally and externally, yet they still face discrimination lawsuits, and the leadership ranks of the business world remain predominantly white and male.

CONTINUED: https://www.scientificamerican.com/article/how-diversity-makes-us-smarter/

Megastorms vs. megadroughts: Climate change brings a potentially devastating “atmospheric river” to California

After years of drought, the Golden State is hit by epic storms — and it’s just the beginning of climate chaos

Michelle Wolfe, who had to evacuate her nearby mobile home, looks out toward flooded vineyards in the Russian River Valley, Monday, Jan. 9, 2017, in Forestville, Calif. (Credit: AP/Eric Risberg)

As the incoming Trump administration turns Washington increasingly freakish and bizarre, reinventing government as a reality show, Mother Nature is doing something equally dramatic 3,000 miles away. Donald Trump can deny climate change all he wants, but Californians can’t escape the contrasting weather extremes it is already causing or intensifying. We’re in a cycle of ever more serious droughts broken by more intense storms — harbingers of much more serious challenges to come. What’s happening in California now serves to underscore long-term realities, regardless of the day-to-day fantasies of those who temporarily hold political power.

A series of storms from the vicinity of Hawaii, known as the “Pineapple Express,” has drenched California and parts of Nevada, signaling a likely end to four years of severe drought. During the storm that hit Jan. 7 to 10 alone, there were 52 reports of extreme precipitation (meaning more than eight inches of rain in a three-day period), with several measuring twice that. Strawberry Valley, on the western slopes of the Sierra Nevada, got an amazing 20.51 inches of rain during that storm — more than Los Angeles typically gets in an entire year.
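To make that definition concrete, here is a minimal sketch, in Python, of how such reports might be flagged. The threshold is the one stated above (more than eight inches of rain in a three-day period); the station list and every total other than Strawberry Valley’s are invented for illustration.

    # Flag "extreme precipitation" reports: more than 8 inches of rain
    # over a 3-day period, the definition used in the article.
    EXTREME_THRESHOLD_INCHES = 8.0

    # Hypothetical 3-day totals; only Strawberry Valley's figure is from the article.
    three_day_totals = {
        "Strawberry Valley": 20.51,
        "Hypothetical Station A": 9.2,
        "Hypothetical Station B": 4.7,
    }

    extreme_reports = {
        station: total
        for station, total in three_day_totals.items()
        if total > EXTREME_THRESHOLD_INCHES
    }

    for station, total in sorted(extreme_reports.items(), key=lambda kv: -kv[1]):
        print(f"{station}: {total:.2f} inches in 3 days (extreme)")

By this definition, Strawberry Valley’s 20.51 inches is more than double the threshold, consistent with the note that several reports measured twice that.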

The Pineapple Express is just one example of a worldwide phenomenon known as “atmospheric rivers,” or ARs. These are long, narrow ribbons of moist air, tens to hundreds of miles wide, that can carry roughly 10 times as much water vapor as the Mississippi River discharges at its mouth. Powerful as the current AR storms are, they pale in comparison to the month-long storms of 1861-62 that flooded much of the state, creating a 300-mile lake in the San Joaquin Valley. But even worse is possible. In 2011, the U.S. Geological Survey did a study of what a 1,000-year atmospheric river storm — known as ARkStorm — would do. Projected losses were staggering, including property losses around $400 billion (more than three Hurricane Katrinas), with another $325 billion in losses due to business interruption lasting as long as five years. So Californians are lucky today.


The percentage of the state that is defined as “drought-free” has almost doubled overnight, from about 18 percent to 34.5 percent, according to the U.S. Drought Monitor. The drought-free area is largely in less-populated Northern California, above an east-west line running from San Francisco to Lake Tahoe, but there are broader signs of hope for the whole state.

“This is likely to be the end of the surface-storage drought for most of the state,” wrote water expert Jeffrey Mount, of the Public Policy Institute of California. With a few more days of rain, he predicted, “almost all the major reservoirs will be at or above their seasonal averages … conditions we have not seen in six years. This is great news since reservoirs are the primary source of water for cities and farms.”

Still, the good news has to be sharply qualified. Even before California’s latest drought, a much longer, continent-wide drought was underway, as shown in a panel of eight annual drought maps from the 2009 paper “Megadroughts in North America” by Edward Cook and co-authors. In a related document, they show that during the medieval period, from 1021 to 1382, the majority of the continental U.S. experienced four megadroughts lasting 22 to 40 years, interspersed with occasional isolated non-drought years. These were three to four times longer than comparable modern multi-year droughts between 1855 and 1957, which ranged from seven to 10 years. Thus, California’s climate this century is already atypical for the modern era; the state may already be in the middle of a medieval-style megadrought, and it needs more than one good year of rain to begin breathing easier.

The underlying science behind these phenomena is increasingly coming into focus, according to Marty Ralph, director of the Center for Western Weather and Water Extremes at Scripps Institution of Oceanography. “It has been shown that in major parts of the West drought is due to a reduced amount of precipitation from the wettest days, many of which are AR events,” Ralph told Salon.

“We have also known for about 10 years now that most of the big flooding events in the West Coast, at least, are a result of atmospheric rivers. These findings are especially strong in the West Coast and Southwest, and in Western Europe. Thus, indeed, the future of drought and flood in this region hinges on the fate of ARs. And climate models vary substantially in how they handle this.”

What is certain is that both extreme drought and extreme AR storms, driven by global climate change, pose growing challenges to California and many other places in the decades ahead. The divergent extremes place increasing stress on the whole ecosystem, as well as its physical underpinnings. “It’s a really bad combination of two extremes,” MIT’s Adam Schlosser told Pacific Standard. “The drought dries, and, in some sense, cooks up the ground. It becomes more susceptible to heavy rain. You’re putting together a meta-event that could be quite destructive.”

Schlosser was discussing a paper to which he contributed, which projects that California will experience three more extreme precipitation events per year by 2100, although that number could be cut in half if aggressive policy measures are pursued. These results are more dramatic than, but point in the same direction as, research published last summer by Christine Shields and Jeffrey Kiehl at the National Center for Atmospheric Research in Colorado.

Shields agreed with Schlosser’s warning. “Drought-stricken areas can be significantly damaged by heavy flood,” she told Salon. Although she hadn’t yet read Schlosser’s paper, she warned against overemphasizing any differences. “The different climate projections found in the literature may be due to, in part, a difference in the way the ARs are defined and tracked,” she noted.

It’s also important to distinguish between different measures. “Intensity of rain is not the same thing as overall rain totals, or mean [average] rain,” Shields said. “Potentially stronger rainfall rates would lead to increased likelihood for localized flooding, or flash flooding. Longer durations of storms also might imply increased likelihood for overall rain within the storm itself. It doesn’t say anything about changing the mean rainfall over a given season or region. Any way you slice it, projections should be used as guidelines and not ground truth.”

Those guidelines are all pointing in the same general direction: more climate and weather problems, and more intense problems. But sorting out the differences will be crucial for developing policy responses, Ralph stressed. “The already high variability of annual precipitation in this region could become even more variable in future climate scenarios,” he said. “We don’t have a good handle on which climate projections handle ARs best in the future, and those projections differ substantially in how these events look in the future.  We need to pin this down better, to help inform policy-makers on what to expect in the future for water supply and flood risk.”

When asked what can be done to improve policy responses, Ralph replied, “A major effort is needed to improve short-term predictions of ARs, so that information could be incorporated into myriad decisions made when extreme precipitation occurs, from reservoir operations to transportation to emergency response to flood control, landslides and other impacts such as we’ve seen in California, Nevada and Oregon” over the past few weeks. “Because ARs are the key to seasonal precipitation in this region, we now know what to focus on in terms of research.”

A continent away from Washington, this is what reality-based public policy planning looks like in the age of inexorable climate change. But that doesn’t mean climate science is infallible. Last winter many forecasters predicted significant precipitation fueled by the Pacific climate cycle known as El Niño, and as Ralph puts it, that was a “bust.” At the moment, there are scientific limits on the “predictability of water in the West,” he warned. “We also have the fact that hurricanes and tornadoes attract much of the attention and funding in meteorology. It has been difficult to get adequate focus on these Western water issues.”

Ralph’s center is “creating new AR-oriented forecast tools, built upon new science,” he said. Information about this can be seen in real time on the center’s website, including a “What’s New” section that has brief examples of these products for this last series of storms. You can even sign up for automated email alerts issued daily when there are extreme precipitation events in the West (like the 52 such events mentioned above).

As Californians weather the tail end of this dramatic string of storms, it can be comforting to realize that so much is being done to advance our understanding of the climate challenges facing America’s most populous state. That understanding is starting to translate into better ways of coping with what’s to come, however challenging that future may be. The reality-based community mobilizing to protect California’s precarious future in the face of climate change is a model worth celebrating — and also duplicating, in as many realms of public policy as possible. Finding ways to do that is a top priority for all of us, wherever we live.

 

The Obama administration and the legitimization of torture


By Tom Carter
7 January 2017

On December 28, US District Judge Royce Lamberth ordered a complete copy of the Senate Intelligence Committee’s 2014 report on the Central Intelligence Agency’s torture program during the Bush administration to be delivered to a federal courthouse, where it is to be preserved in a safe by a judicial security officer. Lawyers for torture victim Abd al-Rahim Al-Nashiri requested this extraordinary measure on the grounds that efforts were underway within the other branches of the US government to destroy and erase every copy of the full report.

The Senate Select Committee on Intelligence report, titled “Committee Study of the Central Intelligence Agency’s Detention and Interrogation Program,” was finalized in December 2014 after protracted efforts by the Obama administration and the CIA to obstruct and delay the investigation. At that time, a heavily redacted 525-page “executive summary” was released to the public. The full report, which apparently numbers some 6,700 pages, has been kept secret.

The redacted copy of the executive summary establishes unequivocally that CIA personnel perpetrated war crimes, including torture and murder. Among the more depraved and sadistic torture methods exposed by the report was the practice of so-called “rectal feeding,” which involved forcibly pumping puréed food into the victim’s rectum “without evidence of medical necessity,” in the dry language of the report. These war crimes were carried out systematically with the knowledge of senior figures in the Bush administration from 2001 to 2006, and were followed by an extensive high-level cover-up. (See “What is in the Senate Intelligence Committee Report on CIA torture”.)

Republican Senator Richard Burr, a vocal Trump supporter who replaced Democrat Dianne Feinstein as Senate Intelligence Committee chairperson following the 2014 midterm elections, has demanded the return of every copy of the report from the Obama administration. The CIA’s copies of the report were “inadvertently” and “accidentally” destroyed by the CIA inspector general’s office in the summer of 2015.

Virtually no one has been allowed to read the full report, and a concerted effort is underway to make sure that nobody is able to read it in the future. No one has been criminally prosecuted for gross violations of international law, American statutes and the US Constitution, and the moves to make the report disappear are aimed at, in addition to censoring history, making sure that none of the criminals involved in the program are ever brought to justice.

It would be hard to find more damning evidence of the utterly rotten state of American democracy than the fate of this report.

The fact that neither the CIA nor the Bush officials who sanctioned the torture program have suffered any negative consequences, despite the presentation by the United States Senate of detailed evidence of war crimes, points to the degree to which authoritarian tendencies have asserted and entrenched themselves in the American state. The only CIA employee who has suffered negative consequences in connection with torture is analyst John Kiriakou, who was prosecuted by the Obama Justice Department and sentenced to 30 months in prison for the “crime” of revealing to the public the CIA’s use of waterboarding.

In America, certain democratic rituals continue to be observed, but the reality is that the military, the intelligence agencies, and the largest business and financial institutions dictate policy to what would once have been called the “civilian” branches of government, including both official political parties.

In order to shield the CIA from accountability for its crimes, President Obama has refused to declassify the report, which he has the power to order unilaterally. He has also refused to incorporate the report into the records of federal agencies, as requested by several lawmakers, a procedural maneuver that would facilitate its preservation and future declassification. In a token measure, he has ordered a copy retained in his official presidential records. This might save a single copy from destruction for the moment, but it would delay its release to the public until at least 2028.

The case of Al-Nashiri was the subject of particular scrutiny in the Senate investigation. Before being transferred to the Guantanamo Bay torture camp, Al-Nashiri was abducted and “rendered” to a series of secret CIA “black sites” in Afghanistan, Thailand, Poland, Morocco and Romania. He was waterboarded, shackled naked and hooded, and threatened with guns and power drills, among other abuses. To prevent evidence of war crimes from coming to light, the CIA destroyed the tapes of Al-Nashiri’s waterboarding in 2005.

Al-Nashiri, a Saudi citizen and alleged Al Qaeda leader, is the subject of ongoing proceedings before a US military commission, in which the Obama administration is seeking the death penalty. Underscoring the absurd character of these supposedly “legal” proceedings, Al-Nashiri will not be released even if he is found to be not guilty.

In July 2014, the European Court of Human Rights found Poland to be in violation of the European Convention on Human Rights for its complicity in the detention and torture of Al-Nashiri, ordering Poland to pay him €100,000 in damages.

The story of the CIA “enhanced interrogation” program is one of crimes compounded by crimes, lies told upon lies, implicating higher and higher levels of the state, eventually metastasizing into a full-blown constitutional crisis. A full recitation of the scandal’s long development would require several books.

During the Senate Intelligence Committee investigation, CIA Director John Brennan ordered agents to break into Senate staffers’ computers in an effort to delete incriminating information. Then the CIA provocatively demanded that the staffers be prosecuted for stealing confidential information, which prompted reciprocal demands for the CIA burglars themselves to be prosecuted, as well as an extraordinary speech on the Senate floor by Feinstein in March 2014. (See “Senate Intelligence head accuses CIA of undermining US ‘constitutional framework’”.) During the Senate investigation, the CIA took the position that it could keep information secret from the Senate Intelligence Committee, which is charged with overseeing the CIA.

The Obama administration purported to resolve the crisis with the announcement that nobody on either side would be prosecuted. At the end of 2014, the administration colluded with congressional Republicans in an effort to block the release of the report until the Republicans could obtain control of both houses. As part of these efforts, Secretary of State John Kerry placed a call to Democratic Senator Dianne Feinstein to urge her to “consider” further delaying the release of the report.

After the report’s release, the Obama administration continued to do everything it could to block and suppress the report. While Obama made “transparency” a plank of his election platform, his administration vigorously opposed efforts to secure the release of the report under the Freedom of Information Act, as in the case of ACLU v. CIA.

The administration’s efforts to cover up torture and shield torturers from accountability have been a significant factor in the legitimization of torture in the US, paving the way for an escalation of the practice under Trump.

A particularly menacing article appeared in the Wall Street Journal last month titled, “Sorry, Mad Dog, Waterboarding Works.” The author, James E. Mitchell, boasts of having been “authorized” to conduct “enhanced interrogation” by the CIA. He proudly describes having “personally waterboarded” three men while working as a CIA contractor.

The Senate report identifies Mitchell as one of the chief architects of the torture program. In at least one lawsuit, he has been charged with engaging in a “joint criminal enterprise” with the US government that involved “torture; cruel, inhuman, and degrading treatment; non-consensual human experimentation; and war crimes.”

In the article, Mitchell lashes out at the Senate report, calling it “partisan,” and he denounces retired Marine Corps General James “Mad Dog” Mattis, Trump’s appointee for secretary of defense, for his pragmatic statements that torture does not work. Mitchell argues that “harsh” interrogation methods are justified in a “ticking-time-bomb scenario.” However, underscoring the fraud of that oft-cited argument, Mitchell does not allege that any time bombs were ticking when he tortured his victims.

Mitchell is a war criminal and he should have been arrested and prosecuted a long time ago. The fact that he can openly boast of his conduct in the press is the product of the Obama administration’s dogged efforts to cover up torture and shield perpetrators such as Mitchell from accountability. If all of the Bush-era torturers had been sentenced to lengthy jail terms—together with those who authorized the program, lied about it, and tried to cover it up—it goes without saying that conditions would not be as favorable for such an article to appear in the Wall Street Journal, or for Trump to shout about how he will “bring back a hell of a lot worse than waterboarding.”

The fact that the Senate report on CIA torture is now in danger of being destroyed or locked away for more than a decade is a fitting symbol of the legacy of the Obama administration. Obama was elected based on popular illusions that he would reverse the hated policies of the Bush administration. Instead, by any objective standard, the Obama administration was among the most reactionary in American history. Over a period of eight years, Obama oversaw a broad assault on basic democratic rights, the strengthening of the apparatus of a police state, and a massive transfer of wealth to the super-rich. These policies helped lay the foundations for the rise of an authoritarian populist like Trump.

Without presenting a complete list, the Obama administration’s legacy as it pertains to democratic rights includes carrying out and justifying assassinations of US citizens, codifying military commissions and indefinite detention without judicial due process, persecuting whistleblowers and journalists, further expanding the illegal regime of domestic spying, blocking efforts at transparency, deporting immigrants en masse, cracking down on protests and prosecuting political activists on the basis of anti-terror laws, abetting the epidemic of police brutality, further militarizing local police, and asserting immunity on behalf of killer cops in proceedings before the Supreme Court.

When Obama first took office, he made it a priority to shield Bush-era criminals. Bush administration officials, war criminals who carried out torture, and Wall Street financial criminals who crashed the economy all got a free pass under Obama, who pledged to “look forward, not backward.” While he was elected on promises to close the infamous Guantanamo Bay facility, Obama ends his eight years in office with the torture camp still in operation.

Trump, for his part, has declared that “torture works,” and has promised to keep Guantanamo open. “I watched President Obama talking about Gitmo, right, Guantanamo Bay, which by the way, which by the way, we are keeping open,” Trump declared in November. “Which we are keeping open … and we’re gonna load it up with some bad dudes, believe me, we’re gonna load it up.”

Given the boundless sadism and depravity of the CIA torture that has already been exposed, the mind boggles at Trump’s proposal to implement practices that are “a hell of a lot worse.” Trump has also declared that American citizens who are accused of “terrorism” can be transferred to Guantanamo Bay.

WSWS

47% of Jobs Will Disappear in the Next 25 Years


The Trump campaign ran on bringing jobs back to American shores, although mechanization has been the biggest reason for manufacturing jobs’ disappearance. Similar losses have led to populist movements in several other countries. But instead of a pro-job growth future, economists across the board predict further losses as AI, robotics, and other technologies continue to be ushered in. What is up for debate is how quickly this is likely to occur.

Now, an expert at the Wharton School of the University of Pennsylvania is ringing the alarm bells. According to Art Bilger, venture capitalist and board member at the business school, all the developed nations on earth will see job loss rates of up to 47% within the next 25 years, a figure drawn from a recent Oxford study. “No government is prepared,” The Economist reports. These losses will include both blue- and white-collar jobs; so far, they have been largely restricted to the blue-collar variety, particularly in manufacturing.

To combat “structural unemployment” and the terrible blow it is bound to deal the American people, Bilger has formed a nonprofit called Working Nation, whose mission is to warn the public and to help make plans to safeguard workers from this worrisome trend. Not only is the entire concept of employment about to change in dramatic fashion; the trend is irreversible. The venture capitalist called on corporations, academia, government, and nonprofits to cooperate in modernizing our workforce.

To be clear, mechanization has always cost us jobs. The mechanical loom, for instance, put weavers out of business. But it has also created jobs: mechanics had to keep the machines going, machinists had to make parts for them, workers had to attend to them, and so on. Often, those in one profession could pivot to another. At the beginning of the 20th century, for instance, automobiles were putting blacksmiths out of business. Who needed horseshoes anymore? But blacksmiths soon became mechanics. And who was better suited?

A Toyota plant in Japan. Manufacturing is almost fully automated today, and many other jobs are not far behind.

Not so with this new trend. Unemployment today is significant in most developed nations, and it’s only going to get worse. By 2034, less than two decades away, mid-level jobs will be by and large obsolete. So far the benefits have gone mostly to the ultra-wealthy, the top 1%. This coming technological revolution is set to wipe out what looks to be the entire middle class. Not only will computers be able to perform tasks more cheaply than people, they’ll be more efficient too.

Accountants, doctors, lawyers, teachers, bureaucrats, and financial analysts beware: your jobs are not safe. According to The Economist, computers will be able to analyze and compare reams of data to make financial or medical decisions. There will be less chance of fraud or misdiagnosis, and the process will be more efficient. Not only are these professions in trouble; the trend is likely to freeze salaries for those who remain employed, while income gaps only widen. You can imagine what this will do to politics and social stability.

Mechanization and computerization cannot cease. You can’t put the genie back in the bottle. And everyone must adopt it, eventually. The mindset is this: other countries would use such technology to gain a competitive advantage, and therefore we must adopt it too. Eventually, new tech startups and other businesses might absorb those who have been displaced. But the pace is sure to be far too slow to avoid a major catastrophe.

According to Bilger, the problem has been building for a long time. Take into account the longevity we enjoy nowadays and the US’s broken education system, and the problem is compounded. One proposed solution is a universal basic income doled out by the government, a sort of baseline one would receive for survival. Beyond that, re-education programs could help people find new pursuits. Others would want to start businesses or take part in creative enterprises. It could even be a time of the flowering of humanity, when instead of chasing the almighty dollar, people would be able to pursue their true passions.

On a recent radio program, Bilger talked about retooling the education system in its entirety, including adding classes that are sure to translate into the skills workers will need for the jobs that remain. He also discussed the need to retrain middle-aged workers so that they can participate in the economy rather than be left behind; Bilger said that “projects are being developed for that.” Though he admits that many middle-aged workers are resistant to reentering the classroom, Bilger says it’s necessary. What’s more, Working Nation is looking at ways of making the classroom experience more dynamic, such as using augmented reality for retraining purposes, as well as to reinvent K-12 education. But such plans are in the seminal stages.

Widespread internships and apprenticeships are also on the agenda. Today, the problem, as some contend, is not that there aren’t enough jobs, but that there aren’t enough skilled workers to fill the positions that are available. Bilger seems to think that this problem will only grow more substantial.

But would those who drive for a living, say long-haul truckers and cab drivers, really find a place in the new economy with retraining, once self-driving vehicles become pervasive? No one really knows. Like any major shift in society, this one is likely to produce winners and losers. This pivot point contains the seeds of a pragmatic utopia or of complete social upheaval, but the outcome is likely to fall somewhere in between.

Bilger ended the interview saying, “What would our society be like with 25%, 30% or 35% unemployment? … I don’t know how you afford that, but even if you could afford it, there’s still the question of, what do people do with themselves? Having a purpose in life is, I think, an important piece of the stability of a society.”

 

 

Americans’ Political ‘Psychotic Break’

People have discovered that their sense of self is in some significant way false.


Stripped of the false realities of democracy, legitimate media authorities, and American exceptionalism, U.S. society is having a “psychotic break” of sorts. What many Americans have previously believed to be “reality” is disintegrating.

Science provides us with no monolithic explanation for what is commonly called a psychotic break, but some people who have lived this experience describe their sense of who they have believed themselves to be as disintegrating in a massive way, a discovery that their sense of self is in some significant way false. This experience can be overwhelming, emotionally and cognitively, and can propel them into an altered state.

Every so often, the American societal-political veil lifts, and what was clear to George Carlin and other cynical nonvoters becomes difficult to deny even for voters skilled at denial. In the 2016 presidential selection/election process, the veil lifted, making it difficult even for previously trusting Americans to continue believing that they lived in a democracy that provided them with a choice and a say, and to continue believing in the legitimacy of mainstream media. It has become difficult even for those skilled in denial to believe in the American exceptionalist notion that their nation is immune from what other nations are not: a con man taking power by exploiting a sense of victimization—a reality that is now difficult to deny even for a growing number of betrayed Trump voters.

Before getting to the disintegration for Sanders and Trump supporters, first some of the unsettling blows that have made it difficult for millions of Americans in general to deny that they had a false sense of their societal-political reality:

* The chronic tension of the lesser-of-two-evils choice, which had previously produced “democracy dissonance” for some Americans, was ratcheted up considerably in 2016, expanding the number of Americans incredulous that they lived in a democracy that provided choice and say. In 2016, dislike for the Republican and Democratic candidates reached historic levels, with Trump’s July 2016 unfavorable polls averaging 57% (favorable, 36%) and Clinton’s averaging 56% (favorable, 38%). Perhaps unfavorable levels for both candidates need to hit 70% for that variable alone to produce a psychotic break, but this wasn’t the only assault on many Americans’ “societal-political sense of self.”

* The “loser” of the presidential election received nearly 3 million more votes than the “winner.” Of course, for some Americans who believe they live in a democracy that provides them with a choice and a say, perhaps the “loser” needs to receive 10 million more votes than the “winner” before their psychotic break occurs.

* The day before the 2016 election, mainstream media were close to certain that Hillary Clinton would win (New York Times, 85%; CNN, 91%; Huffington Post, 98%). This prognostication failure further shattered what was left of trust in mainstream media authorities.

While science provides us with no monolithic explanation for a psychotic break, one trigger that resonates with some people who have had this experience is a horrific “double bind.” An example of such a double bind: a sexually abused child is told by the family-member abuser, “It’s now too late for you to escape, because I will deny it and nobody will believe you.” The child is left with the choice between enduring continued sexual abuse and denouncing the abuser, an action the child believes will result in being accused by the family of lying and/or being held responsible for destroying the family. And so this unresolvable dilemma overwhelms the child.

The lesser-of-two-evils choice—especially when the two choices are both extremely evil—is a double bind of sorts.

Bernie Sanders put his 12 million primary voters and other supporters in a double bind. For Sanders supporters, Hillary Clinton epitomized what they despised. Clinton has been: heavily supported by Wall Street and arms dealers; repeatedly pro-war, from Iraq to Libya; a friend and admirer of Henry Kissinger, who for Sanders supporters is one of the greatest war criminals in world history; a former member of the board of directors of the anti-union Wal-Mart; and a co-sponsor of the Flag Protection Act of 2005, which included prison terms for those who destroy the flag. For progressives, hers is an otherwise despicable and untrustworthy history.

Bernie Sanders’ choice was either to support someone his supporters despised and distrusted, or to withhold his support, let Trump win, and watch the Democratic Party and its media operatives politically assassinate him, as they did Ralph Nader after the 2000 election. Sanders’ public reaction was to choose what he had many reasons to believe was a false reality—that Clinton was not going to betray her new-found progressivism. Given Clinton’s history, Sanders had good reason to believe that Clinton as president would likely betray her progressive campaign promises and simply blame the failure on Republicans. But rather than choosing Nader’s path, Sanders suppressed the reality of Clinton, and asked his supporters to do the same.

Many Sanders supporters could not shed the reality of Hillary Clinton’s anti-progressive history, or the fact that the Democratic Party establishment had sabotaged Sanders (who the polls had shown had a much better chance than Clinton of beating Trump). These supporters lost faith in both Sanders and the electoral process and did not vote—a political-self psychotic break of sorts for people who had ardently believed in voting.

Other Sanders supporters followed Sanders’ direction and voted for Clinton, only to find themselves now assaulted by the reality that Sanders had instructed them to support a corrupt political process that resulted in Trump winning anyway.

How about Trump supporters? Millions of Trump supporters, even before his inauguration, began having their political-self psychotic break, recognizing that they had been “played”: Trump had no intention of keeping his campaign promises, and had used his supporters to gain power and attention.

A major issue for Trump supporters was “crony capitalism,” but even before Trump was inaugurated, he orchestrated the Carrier deal of tax breaks for jobs, which was so obviously a betrayal that even Sarah Palin decried it, calling it “crony capitalism.”

That has not been Trump’s only pre-inauguration betrayal.

Trump repeatedly promised to “drain the swamp.” The epitome of the “swamp” is the revolving door between the U.S. government and Goldman Sachs, yet Trump’s nominees for his administration include former Goldman Sachs employees Steve Mnuchin for Treasury Secretary and Gary Cohn for the National Economic Council. It has become increasingly clear that Trump is well on his way to creating the most putrid swamp ever: he has nominated for cabinet positions six of his top donors, as well as several establishment politicians (for example, Senate Majority Leader Mitch McConnell’s wife, Elaine Chao, as Transportation Secretary, and 20-year U.S. Senator Jeff Sessions as Attorney General).

Trump promised his supporters an “anti-politician,” and they received a caricature of a politician who didn’t even wait until he was inaugurated to betray his promises.

For the patriotic “Make America Great Again” Trump supporters who have believed their entire lives that Russia is America’s enemy and that the CIA protects Americans from “commies and terrorists,” what do they do with the reality that Trump has made war on the CIA and has befriended Russia, which the CIA reports actively worked for Trump’s election?

From Trump’s history, it is not likely that these mind-blowing assaults on what was once considered “conservatism” and “patriotism” will end.

Among Trump’s approximately 63 million voters, some now claim that the most passionate rallying cry of every Trump rally—“lock her up”—was just theater, and that they are unbothered that almost immediately after the election, Trump stated he would not prosecute Clinton. Their focus is only on the financial promises: that Trump—whom they believe to be a “warrior businessman”—will grow the GDP at a fantastic rate, and that this, along with deregulation and corporate tax breaks, will result in a return of high-paying jobs. For this group, the future holds another likely shock. When corporations accrue more cash, recent history shows, they do not hire more workers at higher salaries but instead spend it on stock buybacks.

What happens post-psychotic break?

The individual psychotic break and the resulting altered state look, from the outside, like a frightening frenzy of beliefs, speech, and behavior that makes no sense. But to those experiencing it, there can be an array of new ideas—some of which they ultimately reject as delusional (e.g., no, they can’t fly), but some of which are not delusional (e.g., yes, they have been traumatized by authorities who have lied to them).

On an individual level, psychotic breaks routinely go one of two ways. If one is lucky and has support, one can emerge from this altered state with greater clarity about one’s true self. But if one is unlucky, and fear and unsafety sabotage this process, one can become permanently labeled as “seriously mentally ill.”

So, with America’s societal-political psychotic break, it is quite possible that a few million more people will emerge with George Carlin-like clarity about the truth of the American sham democratic political system—a truth borne out by studies such as “Testing Theories of American Politics: Elites, Interest Groups, and Average Citizens” that validate Carlin’s observation that no matter whether Republicans or Democrats are in charge, average Americans have no fucking influence on government policy.

Or, this societal-political “psychotic break” can result in further deterioration, further “social-political illness,” transforming the United States from “friendly fascism” and bullshit hypocrisy about democracy to violent, boot-in-your-face fascism where truth tellers in the tradition of George Carlin are driven underground, way underground.

 

How Social Isolation Is Killing Us

My patient and I both knew he was dying.

Not the long kind of dying that stretches on for months or years. He would die today. Maybe tomorrow. And if not tomorrow, the next day. Was there someone I should call? Someone he wanted to see?

Not a one, he told me. No immediate family. No close friends. He had a niece down South, maybe, but they hadn’t spoken in years.

For me, the sadness of his death was surpassed only by the sadness of his solitude. I wondered whether his isolation was a driving force of his premature death, not just an unhappy circumstance.

Every day I see variations at both the beginning and end of life: a young man abandoned by friends as he struggles with opioid addiction; an older woman getting by on tea and toast, living in filth, no longer able to clean her cluttered apartment. In these moments, it seems the only thing worse than suffering a serious illness is suffering it alone.

Social isolation is a growing epidemic — one that’s increasingly recognized as having dire physical, mental and emotional consequences. Since the 1980s, the percentage of American adults who say they’re lonely has doubled from 20 percent to 40 percent.

About one-third of Americans older than 65 now live alone, and half of those over 85 do. People in poorer health — especially those with mood disorders like anxiety and depression — are more likely to feel lonely. Those without a college education are the least likely to have someone they can talk to about important personal matters.

A wave of new research suggests social separation is bad for us. Individuals with less social connection have disrupted sleep patterns, altered immune systems, more inflammation and higher levels of stress hormones. One recent study found that isolation increases the risk of heart disease by 29 percent and stroke by 32 percent.

Another analysis that pooled data from 70 studies and 3.4 million people found that socially isolated individuals had a 30 percent higher risk of dying in the next seven years, and that this effect was largest in middle age.

Loneliness can accelerate cognitive decline in older adults, and isolated individuals are twice as likely to die prematurely as those with more robust social interactions. These effects start early: socially isolated children have significantly poorer health 20 years later, even after controlling for other factors. All told, loneliness is as important a risk factor for early death as obesity and smoking.

The evidence on social isolation is clear. What to do about it is less so.

Loneliness is an especially tricky problem because accepting and declaring our loneliness carries profound stigma. Admitting we’re lonely can feel as if we’re admitting we’ve failed in life’s most fundamental domains: belonging, love, attachment. It attacks our basic instincts to save face, and makes it hard to ask for help.

I see this most acutely during the holidays when I care for hospitalized patients, some connected to I.V. poles in barren rooms devoid of family or friends — their aloneness amplified by cheerful Christmas movies playing on wall-mounted televisions. And hospitalized or not, many people report feeling lonelier, more depressed and less satisfied with life during the holiday season.

New research suggests that loneliness is not necessarily the result of poor social skills or lack of social support, but can be caused in part by unusual sensitivity to social cues. Lonely people are more likely to perceive ambiguous social cues negatively, and enter a self-preservation mind-set — worsening the problem. In this way, loneliness can be contagious: When one person becomes lonely, he withdraws from his social circle and causes others to do the same.

Dr. John Cacioppo, a psychology professor at the University of Chicago, has tested various approaches to treat loneliness. His work has found that the most effective interventions focus on addressing “maladaptive social cognition” — that is, helping people re-examine how they interact with others and perceive social cues. He is collaborating with the United States military to explore how social cognition training can help soldiers feel less isolated while deployed and after returning home.

The loneliness of older adults has different roots — often resulting from family members moving away and close friends passing away. As one senior put it, “Your world dies before you do.”

Ideally, experts say, neighborhoods and communities would keep an eye out for such older people and take steps to reduce social isolation. Ensuring they have easy access to transportation, through discounted bus passes or special transport services, can help maintain social connections.

Religious older people should be encouraged to continue regular attendance at services and may benefit from a sense of spirituality and community, as well as the watchful eye of fellow churchgoers. Those capable of caring for an animal might enjoy the companionship of a pet. And loved ones living far away from a parent or grandparent could ask a neighbor to check in periodically.

But more structured programs are arising, too. For example, Dr. Paul Tang of the Palo Alto Medical Foundation started a program called linkAges, a cross-generational service exchange inspired by the idea that everyone has something to offer.

The program works by allowing members to post online something they want help with: guitar lessons, a Scrabble partner, a ride to the doctor’s office. Others can then volunteer their time and skills to fill these needs and “bank” hours for when they need something themselves.

“In America, you almost need an excuse for knocking on a neighbor’s door,” Dr. Tang told me. “We want to break down those barriers.”

For example, a college student might see a post from an older man who needs help gardening. She helps him plant a row of flowers and “banks” two hours in the process. A few months later, when she wants to cook a Malaysian meal for her boyfriend, a retired chef comes by to give her cooking lessons.
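The mechanic Dr. Tang describes is essentially a time bank: hours earned by helping one member can later be spent on help from any other. Here is a minimal sketch, in Python, of that ledger idea; the class, method, and member names are invented for illustration and are not linkAges’ actual software.

    class TimeBank:
        """Toy time-banking ledger: members bank hours by helping others."""

        def __init__(self):
            self.balances = {}  # member name -> banked hours

        def record_exchange(self, helper, recipient, hours):
            # Credit the helper and debit the recipient for a completed favor.
            self.balances[helper] = self.balances.get(helper, 0) + hours
            self.balances[recipient] = self.balances.get(recipient, 0) - hours

    bank = TimeBank()
    bank.record_exchange("student", "gardener", 2)  # two hours of gardening help
    bank.record_exchange("chef", "student", 2)      # spent later on cooking lessons
    print(bank.balances)  # {'student': 0, 'gardener': -2, 'chef': 2}

A real system would add posted requests, member matching, and rules about negative balances; the sketch only shows how banked hours flow between members.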

“You don’t need a playmate every day,” Dr. Tang said. “But knowing you’re valued and a contributing member of society is incredibly reaffirming.”

The program now has hundreds of members in California and plans to expand to other areas of the country.

“We in the medical community have to ask ourselves: Are we controlling blood pressure or improving health and well-being?” Dr. Tang said. “I think you have to do the latter to do the former.”

A great paradox of our hyper-connected digital age is that we seem to be drifting apart. Increasingly, however, research confirms our deepest intuition: Human connection lies at the heart of human well-being. It’s up to all of us — doctors, patients, neighborhoods and communities — to maintain bonds where they’re fading, and create ones where they haven’t existed.

Correction: December 24, 2016
An Upshot article on Thursday about the health risks of social isolation misstated the purpose of a grant by the Robert Wood Johnson Foundation to a program, linkAges, dedicated to fighting the problem. The grant to linkAges was for testing a new project connected to the program; it was not meant to help linkAges expand across other areas of the country.

A few good Monuments Men: Saving art from looting and destruction — especially in the Middle East — is a military matter

The British Army is recruiting experts who fancy themselves George Clooney 2.0 to preserve global cultural treasure

The British Army recently announced that it would be recruiting 15 to 20 new officers with specializations in art, archaeology and antiquities who will be deployed in the field, just behind the front lines, to help identify, protect and track art and antiquities that are in danger of being damaged, looted or destroyed.

This is, of course, particularly relevant to Middle Eastern conflicts, where groups like ISIS have shown a giddy eagerness to destroy ancient monuments, on the scale of whole cities like Palmyra and Nimrud, as well as individual pieces of pre-Islamic statuary deemed heretical. The flip side is that these groups are also happy to profit off the very objects they condemn, funding their activities through the illicit trade in antiquities.

These works of great artistry and historical importance — which cannot exactly be blamed for not being Islamic artworks, if they were created a millennium before the Prophet Muhammad was born — can be saved from the mallet if they are destined for the auctioneer’s hammer.

Not long ago, scholars like me (I’m a specialist in the history of art crime) had to work hard to convince not just the general public but also authorities, police and politicians that art crime, particularly illicit trade in antiquities, funded organized crime and terrorism. No longer. This stance has been vindicated, unfortunately, in scores of destructive ways, most obviously through ISIS videos of iconoclasm. The only remaining questions concern the scale of the earnings from looted antiquities and what to do to stem the flow.

The most direct way to curb the looting is to discourage First World buyers from purchasing anything that is not clearly free of any connection to recent excavations in Middle Eastern conflict zones. But while those in the art trade talk a good game, there is profit to be had, from major galleries to online auction sites (where it is easy for a seller to hide his or her identity and difficult to be sure of an object’s provenance, and where some objects have been advertised as still covered in desert sand, as if this were a selling point). The documentary “Blood Antiques” chillingly shows how certain Brussels art dealers, for instance, collude with actors posing as people with looted antiquities to sell, some of which still have desert sand on them.

So the need is clear to help protect surviving monuments from iconoclasts and do what we can to limit the funds for fundamentalist groups. Curbing art crime is one way to do that. The U.K. and France are among the governments that have recently dedicated tens of millions to protecting cultural heritage. The National Endowment for the Humanities in the United States recently launched a new grant for projects with that same goal. So it stands to reason that the military would reinstate officers who might be described as modern-day Monuments Men.

***

It is fitting that the Monuments Men 2.0 should be spearheaded by the British, because they were the masterminds behind the original incarnation. The highest-profile officers were Americans, promoted by recent books like “The Monuments Men,” which focused on George Stout (whose fictional avatar was played by Clooney in his film of the same name), and my own “Stealing the Mystic Lamb,” which focused on Monuments Men Robert Posey and Lincoln Kirstein. (Fictionalized versions of the latter two were played by Bill Murray and Bob Balaban in the “Monuments Men” film.)

But the core of the program was British, led by Sir Leonard Woolley, a rumpled, opinionated archaeologist, a buddy of Agatha Christie and someone in desperate need of a rollicking biography, preferably penned by the master of World War II intrigue romps, Ben MacIntyre. Woolley’s project has its origins in January 1943, when archaeologists Mortimer Wheeler and John Ward-Perkins visited archaeological sites like Leptis Magna in North Africa.

A team of Italian archaeologists had recently excavated the site, but the excavated objects were just left there unprotected. British soldiers were inadvertently damaging ruins without realizing that the stones were of cultural or historical importance. The two prepared homemade “out of bounds” signs to show people where to avoid treading.

Thus began a movement to educate soldiers in the field, for with knowledge of the treasures they might encounter came consideration to protect them. What Wheeler conceived was later enacted by Woolley, as part of the British War Office. But Woolley envisioned this as an advisory role, not one that would actually send officers into war zones. That would be the American contribution.

Meanwhile, in the United States at the outset of World War II, a group of American museum leaders, including Paul Sachs of Harvard’s Fogg Museum and Alfred Barr of the Museum of Modern Art, drew up a list of cultural heritage objects that might be in danger in the course of the fighting on continental Europe. This list was linked to maps that were distributed to officers, along with General Dwight Eisenhower’s important directive to avoid damaging cultural sites whenever possible. There is no record of a previous general making such a statement.

Noah Charney is a Salon arts columnist and professor specializing in art crime, and author of “The Art of Forgery” (Phaidon).