Eight billionaires control as much wealth as the bottom half of the world’s population

Oxfam issues report on eve of Davos conference


17 January 2017

Eight billionaires, six of them from the United States, own as much combined wealth as the bottom half of the world’s population, some 3.6 billion people, according to the latest report on global inequality from the British-based advocacy group Oxfam.

The report was released Monday, on the eve of the annual World Economic Forum in the mountain resort of Davos, Switzerland, at which many of the ultra-rich will converge this week. The Oxfam document contains a range of figures that highlight the staggering growth of social inequality, showing that the income and wealth gap between a tiny financial elite and the rest of the world’s people is widening at an accelerating rate.

New data made available to Oxfam reveals that wealth is even more concentrated than the organization had previously believed. Last year, Oxfam reported that 62 people controlled as much wealth as the bottom half of humanity. In its latest report, the charity notes that “had this new data been available last year, it would have shown that nine billionaires owned the same wealth as the poorest half of the planet.”

Oxfam writes that since 2015, the richest 1 percent of the world’s population has owned more than the rest of the world put together, and that over the past quarter century, the top 1 percent has gained more income than the bottom 50 percent combined.

“Far from trickling down, income and wealth are being sucked upwards at an alarming rate,” the report states. It notes that the 1,810 dollar billionaires on the Forbes 2016 rich list own $6.5 trillion, “as much wealth as the bottom 70 percent of humanity.”

Over the next 20 years, some 500 people will hand over to their heirs more than $2.1 trillion, an amount larger than the gross domestic product of India, a country of 1.3 billion people.

Oxfam cites recent research by the economist Thomas Piketty and others showing that in the United States, over the past 30 years the growth in incomes of the bottom 50 percent has been zero, while the incomes of the top 1 percent have risen by 300 percent.

The same process is taking place in the world’s poorest countries. Oxfam notes that Vietnam’s richest man earns more in a day than the country’s poorest person earns in 10 years.

The report points to the systematic character of the siphoning of global wealth to the heights of society. The business sector is focused on delivering “ever higher returns to wealthy owners and top executives,” with companies “structured to dodge taxes, drive down workers’ wages and squeeze producers.”

This involves the most barbaric and criminal practices. Oxfam cites a report by the International Labour Organisation estimating that 21 million people are forced labourers, generating $150 billion in profits every year. The world’s largest garment companies all have links to cotton-spinning mills in India that routinely use the forced labour of girls.

Small farmers are also being driven into poverty: in the 1980s, cocoa farmers received 18 percent of the value of a chocolate bar, compared to just 6 percent today.

The extent of corporate power is highlighted in a number of telling statistics. In terms of revenue, 69 of the world’s 100 largest economic entities are now corporations, not countries. The world’s 10 largest companies, including firms such as Wal-Mart, Shell and Apple, have combined revenue greater than the total government revenue of 180 countries.

Although the authors avoid any condemnation of the profit system per se, the information provided in their report amounts to a stunning verdict on the capitalist system. It highlights in facts and figures two central processes delineated by Karl Marx, the founder of modern socialism.

In Capital, Marx explains that the objective logic of the capitalist system, based on the drive for profit, is to produce ever greater wealth at one pole and poverty, misery and degradation at the other. In the Communist Manifesto, he explains that all governments are but the executive committee for managing the affairs of the capitalist class.

This is exemplified in the tax policies and other “business-friendly” measures undertaken by governments around the world. The Oxfam report notes that technology giant Apple is alleged to have paid a tax of just 0.005 percent on its European profits.

Developing countries lose around $100 billion a year as a result of outright tax dodging and the exemptions granted to companies. In Kenya, $1.1 billion is lost to government revenue every year because of exemptions, an amount nearly twice the country’s annual health budget.

Government tax policies work hand in hand with tax dodging and criminality. The report cites economist Gabriel Zucman’s estimate that $7.6 trillion of global wealth is hidden in offshore tax havens. Africa alone loses $14 billion in annual revenues because of the use of tax havens: enough to pay for health care that would save the lives of four million children and employ enough teachers to ensure that every African child went to school.

There is one significant omission from Oxfam’s discussion of accelerating inequality. It makes no mention of the critical role of the policies of the world’s major governments and central banks in handing over trillions of dollars to the banks, major corporations and financial elites through bank bailouts and the policies of “quantitative easing” since the eruption of the global financial crisis in 2008.

A discussion of these facts would raise uncomfortable political issues. The report opens by favourably citing remarks by US President Barack Obama to the UN General Assembly in 2016 that a world in which 1 percent of the population owns as much as the other 99 percent can never be stable.

But the very policies of the Obama administration have played a key role in creating this world. After rescuing the financial oligarchs from the results of their own criminal actions with massive bank bailouts, the Obama administration and the US central bank ensured their further enrichment by providing a supply of ultra-cheap money that boosted the value of their assets.

Under Obama, the decades-long growth of inequality accelerated, along with the descent of the ruling class into parasitism and criminality. He paved the way for the financial oligarchy to directly seize the reins of power, embodied in the imminent presidency of casino and real estate billionaire Donald Trump, to whom Obama will hand over the keys to the White House on Friday.

The overriding motivation behind the Oxfam report is fear of the political consequences of ever-rising inequality and a desire to deflect mounting anger over its consequences into harmless channels. It advances the perspective of a “human economy,” but maintains that this can be achieved on the basis of the capitalist market, provided corporations and governments change their mindsets.

The absurdity of this perspective, based on the long-discredited outlook of British Fabianism, which has dominated the thinking of the English middle classes for well over a century, can be seen from the fact that the report is directed to the global financial elites gathered at the Davos summit this week, with a call for them to change their ways.

The bankruptcy of this outlook is demonstrated not only by present-day facts and figures, but by historical experience. A quarter century ago, following the liquidation of the Soviet Union, the air was filled with capitalist triumphalism. Freed from the encumbrance of the USSR, and able to dominate the globe, liberal capitalist democracy was going to show humanity what it could do.

And it certainly has, creating a world marked by ever-rising inequality, the accumulation of wealth to truly obscene levels, oppression and anti-democratic forms of rule, criminality at the very heights of society, and the increasingly ominous prospect of a third world war.

This history brings into focus another anniversary: the centenary of the Russian Revolution. Despite its subsequent betrayal at the hands of the Stalinist bureaucracy, the Russian Revolution demonstrated imperishably, and for all time, that a world beyond capitalism and all its social ills and malignancies is both possible and necessary. Its lessons must inform the guiding perspective for the immense social struggles that are going to erupt out of the social conditions detailed in the Oxfam report.

Nick Beams


Megastorms vs. megadroughts: Climate change brings a potentially devastating “atmospheric river” to California

After years of drought, the Golden State is hit by epic storms — and it’s just the beginning of climate chaos

Michelle Wolfe, who had to evacuate her nearby mobile home, looks out toward flooded vineyards in the Russian River Valley, Monday, Jan. 9, 2017, in Forestville, Calif. (Credit: AP/Eric Risberg)

As the incoming Trump administration turns Washington increasingly freakish and bizarre, reinventing government as reality show, Mother Nature is doing something equally dramatic 3,000 miles away. Donald Trump can deny climate change all he wants to, but Californians can’t escape the contrasting weather extremes it’s already causing or affecting. We’re in a cycle of ever more serious droughts broken by more intense storms — harbingers of much more serious challenges to come. What’s happening in California now serves to underscore long-term realities, regardless of the day-to-day fantasies of those who temporarily hold political power.

A series of storms from the vicinity of Hawaii, known as the “Pineapple Express,” has drenched California and parts of Nevada, signaling a likely end to four years of severe drought. During the storm that hit Jan. 7 to 10 alone, there were 52 reports of extreme precipitation (meaning more than eight inches of rain in a three-day period), with several measuring twice that. Strawberry Valley, on the western slopes of the Sierra Nevada, got an amazing 20.51 inches of rain during that storm — more than Los Angeles typically gets in an entire year.

The Pineapple Express is just one example of a worldwide phenomenon known as “atmospheric rivers” or ARs. These are jet streams of moist air, tens to hundreds of miles wide, that can carry roughly 10 times as much water vapor as the Mississippi River at its mouth. Powerful as the current set of AR storms are, they pale in comparison to the month-long storms of 1861-2 that flooded much of the state, creating a 300-mile lake in the San Joaquin Valley. But even worse is possible. In 2011, the U.S. Geological Survey did a study of what a 1,000-year atmospheric river storm — known as ARkStorm — would do. Projected losses were staggering, including property losses around $400 billion (more than three Hurricane Katrinas) with another $325 billion in losses due to business interruption, lasting as long as five years. So Californians are lucky today.


The percentage of the state that is defined as “drought-free” has almost doubled overnight, from about 18 percent to 34.5 percent, according to the U.S. Drought Monitor. The drought-free area is largely in less-populated Northern California, above an east-west line running from San Francisco to Lake Tahoe, but there are broader signs of hope for the whole state.

“This is likely to be the end of the surface-storage drought for most of the state,” wrote water expert Jeffrey Mount, of the Public Policy Institute of California. With a few more days of rain, he predicted, “almost all the major reservoirs will be at or above their seasonal averages … conditions we have not seen in six years. This is great news since reservoirs are the primary source of water for cities and farms.”

Still, the good news has to be sharply qualified. Even before California’s latest drought, a much longer, continent-wide drought was underway, as shown in a panel of eight annual drought maps in the 2009 paper “Megadroughts in North America” by Edward Cook and co-authors. In a related document, they show that during the medieval period, from 1021 to 1382, the majority of the continental U.S. experienced four megadroughts lasting 22 to 40 years, interspersed with occasional isolated non-drought years. These were three to four times longer than comparable modern multi-year droughts between 1855 and 1957, which ranged from seven to 10 years. Thus, California’s climate this century is already atypical for the modern era. The state may already be in the middle of a medieval-style megadrought, and it needs more than one good year of rain to begin breathing easier.

The underlying science behind these phenomena is increasingly coming into focus, according to Marty Ralph, director of the Center for Western Weather and Water Extremes at Scripps Institution of Oceanography. “It has been shown that in major parts of the West drought is due to a reduced amount of precipitation from the wettest days, many of which are AR events,” Ralph told Salon.

“We have also known for about 10 years now that most of the big flooding events in the West Coast, at least, are a result of atmospheric rivers. These findings are especially strong in the West Coast and Southwest, and in Western Europe. Thus, indeed, the future of drought and flood in this region hinges on the fate of ARs. And climate models vary substantially in how they handle this.”

What is certain is that both extreme drought and extreme AR storms, driven by global climate change, pose growing challenges to California and many other places in the decades ahead. The divergent extremes place increasing stress on the whole ecosystem, as well as its physical underpinnings. “It’s a really bad combination of two extremes,” MIT’s Adam Schlosser told Pacific Standard. “The drought dries, and, in some sense, cooks up the ground. It becomes more susceptible to heavy rain. You’re putting together a meta-event that could be quite destructive.”

Schlosser was discussing a paper to which he contributed, which projects that California will experience three more extreme precipitation events per year by 2100, although that number could be roughly halved if aggressive policy measures are pursued. These results are more dramatic than, but point in the same direction as, research published last summer by Christine Shields and Jeffrey Kiehl at the National Center for Atmospheric Research in Colorado.

Shields agreed with Schlosser’s warning. “Drought-stricken areas can be significantly damaged by heavy flood,” she told Salon. Although she hadn’t yet read Schlosser’s paper, she warned against overemphasizing any differences. “The different climate projections found in the literature may be due to, in part, a difference in the way the ARs are defined and tracked,” she noted.

It’s also important to distinguish between different measures. “Intensity of rain is not the same thing as overall rain totals, or mean [average] rain,” Shields said. “Potentially stronger rainfall rates would lead to increased likelihood for localized flooding, or flash flooding. Longer durations of storms also might imply increased likelihood for overall rain within the storm itself. It doesn’t say anything about changing the mean rainfall over a given season or region. Any way you slice it, projections should be used as guidelines and not ground truth.”

Those guidelines are all pointing in the same general direction: more climate and weather problems, and more intense problems. But sorting out the differences will be crucial for developing policy responses, Ralph stressed. “The already high variability of annual precipitation in this region could become even more variable in future climate scenarios,” he said. “We don’t have a good handle on which climate projections handle ARs best in the future, and those projections differ substantially in how these events look in the future.  We need to pin this down better, to help inform policy-makers on what to expect in the future for water supply and flood risk.”

When asked what can be done to improve policy responses, Ralph replied, “A major effort is needed to improve short-term predictions of ARs, so that information could be incorporated into myriad decisions made when extreme precipitation occurs, from reservoir operations to transportation to emergency response to flood control, landslides and other impacts such as we’ve seen in California, Nevada and Oregon” over the past few weeks. “Because ARs are the key to seasonal precipitation in this region, we now know what to focus on in terms of research.”

A continent away from Washington, this is what reality-based public policy planning looks like in the age of inexorable climate change. But that doesn’t mean climate science is infallible. Last winter many forecasters predicted significant precipitation fueled by the Pacific climate cycle known as El Niño, and as Ralph puts it, that was a “bust.” At the moment, there are scientific limits on the “predictability of water in the West,” he warned. “We also have the fact that hurricanes and tornadoes attract much of the attention and funding in meteorology. It has been difficult to get adequate focus on these Western water issues.”

Ralph’s center is “creating new AR-oriented forecast tools, built upon new science,” he said. Information about this can be seen in real time on the center’s website, including a “What’s New” section that has brief examples of these products for this last series of storms. You can even sign up for automated email alerts issued daily when there are extreme precipitation events in the West (like the 52 such events mentioned above).

As Californians weather the tail end of this dramatic string of storms, it can be comforting to realize that so much is being done to advance our understanding of the climate challenges facing America’s most populous state. That understanding is starting to translate into better ways of coping with what’s to come, however challenging that future may be. The reality-based community that is mobilizing to protect California’s precarious future in the face of climate change is a model worth celebrating — and also duplicating, in as many realms of public policy as possible. Finding ways to do that is a top priority for all of us, wherever we live.

 

Why Millennials Aren’t Afraid of Socialism

 

It’s an old idea, but the people who will make it happen are young—and tired of the unequal world they’ve inherited.

On Wednesday, November 9, at 9:47 am, BuzzFeed News sent out a push notification: “Trump is leading a global nationalist wave. The liberal world order is nearly over and the age of populism is here.” This, from a publication better known for listicles than sweeping political pronouncements. If even BuzzFeed felt it necessary to ring the death knell for the “liberal world order,” then liberalism must be really, really dead.

But what, besides global nationalism, can replace it? The answer is clear if we look at the 2016 election from its inception. The race we should be remembering is not just Clinton versus Trump, but Sanders versus Clinton. For nearly a year, millions of Americans supported an avowed socialist, and many of those people were young—like me.

This new New Left renaissance isn’t confined to the United States: Our British neighbors witnessed a similar wave of enthusiasm for Jeremy Corbyn. It’s kind of funny, if you think about it: The two most prominent politicians to galvanize young people in the United States and the United Kingdom over the last year are old white dudes. Sanders and Corbyn both look like my dad, except even older and less cool.

And it’s not just them—their ideas are old too. Or so it would seem to anyone who came of age before the fall of the Berlin Wall. Socialism, the redistribution of wealth, providing vital benefits and social services through the mechanism of the state—people were talking about this in the 1960s. And in the 1930s. And in the 19-teens. And now Sanders and Corbyn are recycling those hoary ideas (or so the argument goes), their only concession to the 21st century being the incorporation of racial-, queer-, and climate-justice rhetoric. (We can argue about how earnest they are and how successful that’s been).

And yet, in the 2016 primaries, Sanders won more votes from people under 30 than Clinton and Trump combined. Bernie pulled in more than 2 million of us; Clinton and Trump trailed far behind, with approximately 770,000 and 830,000, respectively.

Corbyn’s signature achievement thus far has been nearly tripling the size of the UK Labour Party. With over 550,000 members, it’s the largest political party in Western Europe. Though Corbyn’s supporters are not as strikingly youthful as Sanders’s—the influx of new members has barely changed the party’s average age—the youngest among them have a similar enthusiasm.

If you spent last year wondering why all these young people (“millennials,” as the headlines love to shout) have flocked to dudes even older and less cool than my dad, consider this: I’m 22. I was born in 1994. Bill Clinton was president. It was the era of the New Democrats in the United States and New Labour in the UK. Five years earlier, Francis Fukuyama had famously declared “the end of history,” and neither September 11 nor the global financial collapse had yet shaken that sense of security. My birth, and that of my generation, coincided with a huge geopolitical shift: For the first time in 50 years, the world wasn’t split in two along the familiar capitalist/communist lines of the Cold War. Seemingly, it had become whole.

George W. Bush was president for most of my childhood. My parents were Democrats in a red state, and at that point primarily defined their politics as being against the Iraq War and for same-sex marriage. Things like class, exploitation, and inequality were never mentioned, let alone a systematic way—like socialism—to think about them. I took up these anti-Republican positions with righteous gusto. In fact, I was co-president of my high school’s Young Democrats chapter, where I organized a screening of Jesus Camp and led discussions about the hypocrisy of the right’s “family values” agenda. Those were my politics.

The first president I voted for was Barack Obama, in 2012. By then, the shiny hope-and-change stuff had worn off a bit. I vaguely knew that drones were bad and that those responsible for the financial catastrophe a few years earlier had gotten off easy, but I didn’t think about it much. I was too busy binge-drinking in sweaty college basements—and hey, I’d voted for a Democrat. That was chill, right?

A child of the ’90s, I knew only neoliberalism. Socialism was brand-new.

It was during Obama’s second term that I began to understand how bad the financial crisis was and who was responsible (hint: the financial sector). Occupy Wall Street started to seem less like agenda-less rabble-rousing, as I had thought when I was co-president of the Young Democrats, and more like people confronting wealth and power in an unprecedented—and incisive—way. Thomas Piketty published his neo-Marxist tome, and its introduction alone fundamentally changed the way I understood economics. There was that viral video, based on a 2011 academic study of Americans’ perceptions of inequality, that used stacks of money to illustrate the wealth gap in the United States. I must’ve seen it 30 times.

Four years later, as I finished college, Bernie Sanders shuffled onto the national political stage and offered an analysis: Poverty isn’t a natural phenomenon; it exists because a few people own far more than their fair share. He also offered a solution: The government could act on behalf of those of us just barely treading water. The government’s role, Sanders argued, is to correct the rampant inequality in this country by taxing the rich and using that money to offer real social services.

The erasure of socialist ideas from serious political discourse throughout most of my life wasn’t a historical fluke. The West’s victory in the Cold War—liberal democracy for everyone!—came at the price of iconoclasm, much of it celebratory. In Prague, there used to be a giant socialist-realist statue of Stalin and other communist leaders standing in a line on a hill overlooking the city from the north. Czechs called it the “meat line,” a joke about the long lines they had to wait in to get groceries. Now kids skateboard on the platform where the dictator once kept watch. To visit Prague now—or Budapest, or Sofia, or Bucharest, or Berlin—you might think that communism never happened. All that’s left are a few tacky museums and somber monuments.

So communism was killed, and along with it went any discussion of socialism and Marxism. This was the world of my childhood and adolescence, full of establishment progressives who were aggressively centrist and just as willing as conservatives to privilege the interests of capital over those of labor: think of the reckless expansion of so-called free trade, or the brutal military-industrial complex. For most of my life, I would have been hard-pressed to define capitalism, because in the news and in my textbooks, no other ways of organizing an economy were even acknowledged. I didn’t know that there could be an alternative.

It occurred to me recently that my peers and I will come of age in the era of Trump. It’s a bleak generational landmark, and not one I anticipated, but ideological capitulation and despair are not the answer. In the 1930s and 1940s, many of the most dedicated antifascists were communists. The antidote to radical exploitation and exclusion is radical egalitarianism and inclusion.

So we will be the opposition—but we’re not starting from scratch. The Fight for $15, organized in part by Socialist Alternative, went from a fringe dream to a political reality that has thus far spread to at least 10 cities and two states. Heterodox economists like Ha-Joon Chang, Mariana Mazzucato, and Stephanie Kelton are reshaping their discipline. And while Trump has dominated the headlines, there is still plenty of momentum around the socialist ideas that Bernie used to inspire America. Our Revolution is working hard to take the fight to the states; there it will be joined by groups like the Working Families Party and the Democratic Socialists of America, whose membership has grown by more than 50 percent since November 8. That’s more than 4,000 new members.

When I heard Bernie say, out loud, that the billionaire class was ruthless and exploitative, that sounded groundbreaking. Not only did he name the right problem—inequality, not poverty—he named the culprit. I didn’t know you could do that. To me, and to hundreds of thousands of my peers, Sanders’s (and Corbyn’s) socialism doesn’t feel antiquated. Instead, it feels fresh and vital precisely because it has been silenced for so long—and because we need it now more than ever.

My dad—slightly younger and slightly cooler than Sanders and Corbyn—picked me up from the airport the day before Thanksgiving. In the car, he confessed: “I liked a lot of the things Bernie had to say, but I just didn’t think he could get elected.” He sighed, ran a hand through his white hair, and pushed his glasses up his nose. “I thought Hillary had a better shot, but she couldn’t pull it off. Maybe Bernie could have… Wisconsin, Michigan, Ohio…”

My dad sounded humble. Trump’s election, which to so many of us feels like a tragedy, prompted him to consider a new way of thinking. Maybe socialism isn’t a lost cause after all. Maybe it’s our best hope.

47% of Jobs Will Disappear in the Next 25 Years


The Trump campaign ran on bringing jobs back to American shores, although mechanization has been the biggest reason for manufacturing jobs’ disappearance. Similar losses have led to populist movements in several other countries. But instead of a pro-job growth future, economists across the board predict further losses as AI, robotics, and other technologies continue to be ushered in. What is up for debate is how quickly this is likely to occur.

Now, an expert at the Wharton School of Business at the University of Pennsylvania is ringing the alarm bells. According to Art Bilger, a venture capitalist and board member at the business school, citing a recent Oxford study, all the developed nations on earth will see job loss rates of up to 47% within the next 25 years. “No government is prepared,” The Economist reports. These losses will include both blue-collar and white-collar jobs. So far, the loss has been restricted to the blue-collar variety, particularly in manufacturing.

To combat “structural unemployment” and the terrible blow it is bound to deal the American people, Bilger has formed a nonprofit called Working Nation, whose mission it is to warn the public and to help make plans to safeguard them from this worrisome trend. Not only is the entire concept of employment about to change in a dramatic fashion, the trend is irreversible. The venture capitalist called on corporations, academia, government, and nonprofits to cooperate in modernizing our workforce.

To be clear, mechanization has always cost us jobs. The mechanical loom, for instance, put weavers out of business. But it has also created jobs: mechanics had to keep the machines going, machinists had to make parts for them, workers had to attend to them, and so on. Often, those in one profession could pivot to another. At the beginning of the 20th century, for instance, automobiles were putting blacksmiths out of business. Who needed horseshoes anymore? But they soon became mechanics. And who was better suited?

A Toyota plant, Japan. Manufacturing is almost fully automated today and so many other jobs are not far behind.

Not so with this new trend. Unemployment today is significant in most developed nations and it’s only going to get worse. By 2034, less than two decades from now, mid-level jobs will be by and large obsolete. So far the benefits have only gone to the ultra-wealthy, the top 1%. This coming technological revolution is set to wipe out what looks to be the entire middle class. Not only will computers be able to perform tasks more cheaply than people, they’ll be more efficient too.

Accountants, doctors, lawyers, teachers, bureaucrats, and financial analysts beware: your jobs are not safe. According to The Economist, computers will be able to analyze and compare reams of data to make financial decisions or medical ones. There will be less of a chance of fraud or misdiagnosis, and the process will be more efficient. Not only are these folks in trouble, such a trend is likely to freeze salaries for those who remain employed, while income gaps only increase in size. You can imagine what this will do to politics and social stability.

Mechanization and computerization cannot cease. You can’t put the genie back in the bottle. And everyone must have it, eventually. The mindset is this: other countries would use such technology to gain a competitive advantage, and therefore we must adopt it. Eventually, new tech startups and other businesses might absorb those who have been displaced. But the pace is sure to be far too slow to avoid a major catastrophe.

According to Bilger, the problem has been going on for a long time. Take into account the longevity we are enjoying nowadays and the US’s broken education system, and the problem is compounded. One proposed solution is a universal basic income doled out by the government, a sort of baseline one would receive for survival. After that, re-education programs could help people find new pursuits. Others would want to start businesses or take part in creative enterprises. It could even be a time of the flowering of humanity, when instead of chasing the almighty dollar, people would be able to pursue their true passions.

On a recent radio program, Bilger talked about retooling the education system in its entirety, including adding classes that translate directly into the skills workers will need for the jobs that remain. He also discussed the need to retrain middle-aged workers so that they can participate in the economy rather than be left behind. Bilger said that “projects are being developed for that.” Though he admits that many middle-aged workers are resistant to reentering the classroom, Bilger says it’s necessary. What’s more, they are looking at ways of making the classroom experience more dynamic, such as using augmented reality for retraining purposes, as well as to reinvent K-12 education. But such plans are in the seminal stages.

Widespread internships and apprenticeships are also on the agenda. Today, the problem, as some contend, is not that there aren’t enough jobs, but that there aren’t enough skilled workers to fill the positions that are available. Bilger seems to think that this problem will only grow more substantial.

But would those who drive for a living, say long-haul truckers and cab drivers, really find a place in the new economy with retraining, once self-driving vehicles become pervasive? No one really knows. Like any major shift in society, this one is likely to produce winners and losers. This pivot point contains the seeds of a pragmatic utopia or of complete social upheaval, but the outcome is likely to fall somewhere in between.

Bilger ended the interview saying, “What would our society be like with 25%, 30% or 35% unemployment? … I don’t know how you afford that, but even if you could afford it, there’s still the question of, what do people do with themselves? Having a purpose in life is, I think, an important piece of the stability of a society.”

 

 

Carrie Fisher and the Star Wars phenomenon

By David Walsh
29 December 2016

The death of actress Carrie Fisher on Tuesday at the relatively young age of 60, several days after suffering a heart attack aboard a flight from London to Los Angeles, has evoked expressions of grief from her many fans. The sadness over Fisher’s passing is compounded by the sudden death, just one day later, of her 84-year-old mother, the well-known actress Debbie Reynolds.

Carrie Fisher

Carrie Fisher achieved success not only as an actress but also as a writer and humorist. She was an appealing figure and personality. The daughter of actress Debbie Reynolds and singer Eddie Fisher, Fisher grew up in the entertainment business. When she was born in Beverly Hills in 1956, her mother was one of the biggest stars in Hollywood. Reynolds also had a successful recording career.

Carrie Fisher bore the numerous scars of this upbringing and this milieu, characterized by intense insecurity, instability and self-involvement. It is easy to scoff at the difficulties of someone who grows up in this affluent world, but the list of children of film, television and music stars who have done themselves in, one way or another, is tragically long. Fisher did not suffer that fate, but she certainly suffered. Her struggles with drugs and emotional problems are well-known.

At the age of 19, Fisher landed a leading role in the first Star Wars film (directed by George Lucas), as Princess Leia. She played the same role in two other films in the first series, and showed up again in one of the many sequels, Star Wars: The Force Awakens, in 2015. She also appeared in several dozen other films, mostly in smaller parts.

Fisher also wrote several books, the best known of which is Postcards from the Edge (1987), a thinly and comically disguised portrait of her mother and herself. The novel was made into a mediocre film (1990) directed by Mike Nichols, with Meryl Streep and Shirley MacLaine. Reportedly, Fisher made a living in the 1990s as a “script doctor,” repairing or improving other people’s screenplays.

More recently, she adapted her memoir, Wishful Drinking (2008), into a one-woman show, which had some success in theaters in 2009-10. It was made into a television documentary and released by HBO in September 2011.

Fisher specialized, in her writings, in bringing out the surreal aspects of life as the child of “celebrities,” and then as a celebrity herself. There is a certain self-mocking and self-deprecating charm to her work. She could capture the desperation and absurdity of the pursuit of stardom, of those hoping “to get out of the anonymous frying pan and into the Hollywood fire”—and enumerate its tremendous psychic costs.

Fisher became an amused, skeptical observer of Hollywood, but not its mortal enemy. In another, more radicalized era perhaps, her insight and anger might have carried her much further to the left. As it was, in the stagnant 1980s and 90s, she didn’t travel terribly far. One has the sense that the overall social and artistic conditions never permitted Fisher to look with sufficiently objective and critical eyes at the milieu in which she grew up. She always remained tied to it by numerous strings.

In this age of celebrity worship, it comes as no surprise that the media coverage of Fisher’s death is out of all proportion to her actual achievements. No disrespect is intended here. But an honest evaluation of her career and talent could not avoid the conclusion that Fisher was not a major figure in the history of American cinema. Nevertheless, substantial portions of the national news have been devoted to her passing. In death, we discover that she is an “icon,” a “legend,” and so forth. One suspects that Fisher herself would have laughed at this sort of media blather.

A.O. Scott, the New York Times film critic, enthused (in “Carrie Fisher, a Princess, a Rebel and a Brave Comic Voice”) that Fisher “entered popular culture as a princess in peril and endures as something much more complicated and interesting. Many things, really: a rebel commander; a witty internal critic of the celebrity machine; a teller of comic tales, true and embellished; an inspiring and cautionary avatar of excess and resilience; an emblem of the honesty we crave (and so rarely receive) from beloved purveyors of make-believe.” This is over the top, unnecessarily and substantially so.

The claims for Fisher are only partly inspired by her career, less than the individual writers and eulogizers may think. Much of the over-praise and flattery has to do with the Star Wars franchise itself and its enduring impact. The various commentators are pumping up this “legend” of a franchise as a means of elevating and legitimizing the last several decades of American filmmaking, without question the weakest decades in its history.

Whatever the intentions of George Lucas and others, and they may have been relatively innocent and light-hearted to begin with, there is no question but that Star Wars helped mark the transition in cinema terms to a period of banalization and decay.

The Oxford History of World Cinema explains: “The Hollywood film industry entered a new age in June 1975, with the release of Steven Spielberg’s Jaws. Two years later, George Lucas’s Star Wars spectacularly confirmed that a single film could earn its studio hundreds of millions of dollars in profits, and convert a poor year into a triumph. The place of movies within the Hollywood production system changed: increasingly the focus was on high-cost, potentially highly lucrative ‘special attractions.’”

Walter Metz, in the Cambridge Companion to Modern American Culture, argues that “Star Wars fundamentally changed Hollywood filmmaking at the aesthetic and narrative level but, in terms of the industry, merely returned the business toward the production of big-budget, mass audience blockbusters.”

Critic Robin Wood, discussing the “Lucas-Spielberg Syndrome,” notes that what was “worrying” about the phenomenon was the “enormous importance our society has conferred upon the [Star Wars] films.” The old serials made in the 1940s, which Star Wars was supposedly inspired by, had a “minor and marginal” role in the culture, Wood pointed out, and thus “they posed no threat to the co-existence of challenging, disturbing or genuinely distinguished Hollywood movies, which they often accompanied in their lowly capacity. Today it is becoming difficult for films that are not like Star Wars … to get made.”

This process is far more advanced today. Of course, the filmmakers were not responsible for the growing social indifference and turn to the right by substantial sections of the middle class. They merely reflected and carried forward the process. But there is no reason to mythologize Fisher’s Princess Leia, much less the Star Wars series as a whole.

 

How Social Isolation Is Killing Us

My patient and I both knew he was dying.

Not the long kind of dying that stretches on for months or years. He would die today. Maybe tomorrow. And if not tomorrow, the next day. Was there someone I should call? Someone he wanted to see?

Not a one, he told me. No immediate family. No close friends. He had a niece down South, maybe, but they hadn’t spoken in years.

For me, the sadness of his death was surpassed only by the sadness of his solitude. I wondered whether his isolation was a driving force of his premature death, not just an unhappy circumstance.

Every day I see variations at both the beginning and end of life: a young man abandoned by friends as he struggles with opioid addiction; an older woman getting by on tea and toast, living in filth, no longer able to clean her cluttered apartment. In these moments, it seems the only thing worse than suffering a serious illness is suffering it alone.

Social isolation is a growing epidemic — one that’s increasingly recognized as having dire physical, mental and emotional consequences. Since the 1980s, the percentage of American adults who say they’re lonely has doubled from 20 percent to 40 percent.

About one-third of Americans older than 65 now live alone, and half of those over 85 do. People in poorer health — especially those with mood disorders like anxiety and depression — are more likely to feel lonely. Those without a college education are the least likely to have someone they can talk to about important personal matters.

A wave of new research suggests social separation is bad for us. Individuals with less social connection have disrupted sleep patterns, altered immune systems, more inflammation and higher levels of stress hormones. One recent study found that isolation increases the risk of heart disease by 29 percent and stroke by 32 percent.

Another analysis that pooled data from 70 studies and 3.4 million people found that socially isolated individuals had a 30 percent higher risk of dying in the next seven years, and that this effect was largest in middle age.

Loneliness can accelerate cognitive decline in older adults, and isolated individuals are twice as likely to die prematurely as those with more robust social interactions. These effects start early: Socially isolated children have significantly poorer health 20 years later, even after controlling for other factors. All told, loneliness is as important a risk factor for early death as obesity and smoking.

The evidence on social isolation is clear. What to do about it is less so.

Loneliness is an especially tricky problem because accepting and declaring our loneliness carries profound stigma. Admitting we’re lonely can feel as if we’re admitting we’ve failed in life’s most fundamental domains: belonging, love, attachment. It attacks our basic instincts to save face, and makes it hard to ask for help.

I see this most acutely during the holidays when I care for hospitalized patients, some connected to I.V. poles in barren rooms devoid of family or friends — their aloneness amplified by cheerful Christmas movies playing on wall-mounted televisions. And hospitalized or not, many people report feeling lonelier, more depressed and less satisfied with life during the holiday season.

New research suggests that loneliness is not necessarily the result of poor social skills or lack of social support, but can be caused in part by unusual sensitivity to social cues. Lonely people are more likely to perceive ambiguous social cues negatively, and enter a self-preservation mind-set — worsening the problem. In this way, loneliness can be contagious: When one person becomes lonely, he withdraws from his social circle and causes others to do the same.

Dr. John Cacioppo, a psychology professor at the University of Chicago, has tested various approaches to treat loneliness. His work has found that the most effective interventions focus on addressing “maladaptive social cognition” — that is, helping people re-examine how they interact with others and perceive social cues. He is collaborating with the United States military to explore how social cognition training can help soldiers feel less isolated while deployed and after returning home.

The loneliness of older adults has different roots — often resulting from family members moving away and close friends passing away. As one senior put it, “Your world dies before you do.”

Ideally, experts say, neighborhoods and communities would keep an eye out for such older people and take steps to reduce social isolation. Ensuring they have easy access to transportation, through discounted bus passes or special transport services, can help maintain social connections.

Religious older people should be encouraged to continue regular attendance at services and may benefit from a sense of spirituality and community, as well as the watchful eye of fellow churchgoers. Those capable of caring for an animal might enjoy the companionship of a pet. And loved ones living far away from a parent or grandparent could ask a neighbor to check in periodically.

But more structured programs are arising, too. For example, Dr. Paul Tang of the Palo Alto Medical Foundation started a program called linkAges, a cross-generational service exchange inspired by the idea that everyone has something to offer.

The program works by allowing members to post online something they want help with: guitar lessons, a Scrabble partner, a ride to the doctor’s office. Others can then volunteer their time and skills to fill these needs and “bank” hours for when they need something themselves.

“In America, you almost need an excuse for knocking on a neighbor’s door,” Dr. Tang told me. “We want to break down those barriers.”

For example, a college student might see a post from an older man who needs help gardening. She helps him plant a row of flowers and “banks” two hours in the process. A few months later, when she wants to cook a Malaysian meal for her boyfriend, a retired chef comes by to give her cooking lessons.

“You don’t need a playmate every day,” Dr. Tang said. “But knowing you’re valued and a contributing member of society is incredibly reaffirming.”

The program now has hundreds of members in California and plans to expand to other areas of the country.

“We in the medical community have to ask ourselves: Are we controlling blood pressure or improving health and well-being?” Dr. Tang said. “I think you have to do the latter to do the former.”

A great paradox of our hyper-connected digital age is that we seem to be drifting apart. Increasingly, however, research confirms our deepest intuition: Human connection lies at the heart of human well-being. It’s up to all of us — doctors, patients, neighborhoods and communities — to maintain bonds where they’re fading, and create ones where they haven’t existed.

Correction: December 24, 2016
An Upshot article on Thursday about the health risks of social isolation misstated the purpose of a grant by the Robert Wood Johnson Foundation to a program, linkAges, dedicated to fighting the problem. The grant to linkAges was for testing a new project connected to the program; it was not meant to help linkAges expand across other areas of the country.