When Robots Replace Workers

An article in the Harvard Business Review by William H. Davidow and Michael S. Malone suggests: “The ‘Second Economy’ (the term used by economist Brian Arthur to describe the portion of the economy where computers transact business only with other computers) is upon us. It is, quite simply, the virtual economy, and one of its main byproducts is the replacement of workers with intelligent machines powered by sophisticated code. … This is why we will soon be looking at hordes of citizens of zero economic value. Figuring out how to deal with the impacts of this development will be the greatest challenge facing free market economies in this century. … Ultimately, we need a new, individualized, cultural approach to the meaning of work and the purpose of life. Otherwise, people will find a solution — human beings always do — but it may not be the one for which we began this technological revolution.”

This follows the recent discussion “Economists Say Newest AI Technology Destroys More Jobs Than It Creates,” which cited a NY Times article, and earlier discussions like “Humans Need Not Apply.” What is most interesting to me about this HBR article is not the argument itself so much as the fact that concerns about the economic implications of robotics, AI, and automation are now making it into the Harvard Business Review. Alternative economists have discussed these issues for decades, such as in the Triple Revolution Memorandum from 1964 — even as those projections have been slow to play out, with automation’s initial effect being more to hold down wages and concentrate wealth than to displace most workers. These effects, however, may be reaching the point where they are hard to deny, despite going against mainstream theory, which assumes infinite demand and broad distribution of purchasing power via wages.

As to possible solutions, the HBR article mentions government planning, such as creating public works through infrastructure investments, to help address the issue. There is no mention of expanding the “basic income” of Social Security currently received only by older people in the U.S., expanding the gift economy as represented by GNU/Linux, or improving local subsistence production using, say, 3D printing and gardening robots like Dewey of “Silent Running.” So it seems the mainstream economics profession is starting to accept the emerging reality of this increasingly urgent issue, but is still struggling to think outside an exchange-oriented box for socioeconomic solutions. A few years ago, I collected dozens of possible good and bad solutions related to this issue. Like Davidow and Malone, I’d agree that the particular mix we end up with will be a reflection of our culture. Personally, I feel that if we are heading for a technological “singularity” of some sort, we would be better off improving various aspects of our society first, since our trajectory coming out of any singularity may have a lot to do with our trajectory going into it.

WINTER SOLSTICE


In 2014, the winter solstice in the Northern Hemisphere falls on Dec. 21 at 6:03 p.m. EST (11:03 p.m. UTC); to find the turning point in your own time zone, convert from that moment.

Officially the first day of winter, the winter solstice occurs when the North Pole is tilted 23.5 degrees away from the sun. This is the longest night of the year, meaning that despite the winter cold, the days get progressively longer after the winter solstice until the summer solstice in 2015.

The winter solstice is celebrated by many people around the world as the beginning of the return of the sun, and darkness turning into light. The Talmud recognizes the winter solstice as “Tekufat Tevet.” In China, the Dongzhi Festival is celebrated on the Winter Solstice by families getting together and eating special festive food.

Until the 16th century, the winter months were a time of famine in northern Europe. Most cattle were slaughtered so that they wouldn’t have to be fed during the winter, making the solstice a time when fresh meat was plentiful. Most celebrations of the winter solstice in Europe involved merriment and feasting. In pre-Christian Scandinavia, the Feast of Juul, or Yule, lasted for 12 days, celebrating the rebirth of the sun god and giving rise to the custom of burning a Yule log.

In ancient Rome, the winter solstice was celebrated at the Feast of Saturnalia, to honor Saturn, the god of agricultural bounty. Lasting about a week, Saturnalia was characterized by feasting, debauchery and gift-giving. With Emperor Constantine’s conversion to Christianity, many of these customs were later absorbed into Christmas celebrations.

One of the most famous celebrations of the winter solstice in the world today takes place in the ancient ruins of Stonehenge, England. Thousands of druids and pagans gather there to chant, dance and sing while waiting to see the spectacular sunrise.

Pagan author T. Thorn Coyle wrote in a 2012 HuffPost article that for many contemporary celebrants, solstices “are a chance to still ourselves inside, to behold the glory of the cosmos, and to take a breath with the Sacred.”

In the Northern Hemisphere, friends gather to celebrate the longest night. We may light candles, or dance around bonfires. We may share festive meals, or sing, or pray. Some of us tell stories and keep vigil as a way of making certain that the sun will rise again. Something in us needs to know that at the end of the longest night, there will be light.

In connecting with the natural world in a way that honors the sacred immanent in all things, we establish a resonance with the seasons. Ritual helps to shift our consciousness to reflect the outer world inside our inner landscape: the sun stands still within us, and time changes. After the longest night, we sing up the dawn. There is a rejoicing that, even in the darkest time, the sun is not vanquished. Sol Invictus — the Unconquered Sun — is seen once again, staining the horizon with the promise of hope and brilliance.

We need to talk about death

Why ignoring our darkest fears only makes them worse

It’s a universal human experience. So why do we act like we need to confront it alone?


“I don’t want to die. It’s so permanent.”

So said my terminally ill grandmother, a kick-ass woman who made life-size oil paintings and drank vermouth on the rocks every afternoon.

This isn’t an anecdote I’d be likely to mention in regular conversation with friends. Talk about ruining everyone’s good time. (“Ick, that’s so morbid,” everyone would think.) But earlier this month, the New York Times released its 100 Notable Books of 2014, and among the notables were not one but two – two! – nonfiction titles about death. This seemingly unremarkable milestone is actually one that we should celebrate with a glass of champagne. Or, better yet, with vermouth.

Right now our approach to death, as a culture, is utterly insane: We just pretend it doesn’t exist. Any mention of mortality in casual conversation is greeted with awkwardness and a subject change. That same taboo even translates into situations where the concept of death is unavoidable: After losing a loved one, the bereaved are granted a few moments of mourning, after which the world around them kicks back into motion, as if nothing at all had changed. For those not personally affected by it, the reality of death stays hidden and ignored.

For me this isn’t an abstract topic. There’s been a lot of death in my life. There was my grandmother’s recent death, which sent my whole crazy family into a tailspin, but also my dad’s sudden death when I was 20. Under such circumstances (that is, the unexpected sort), you quickly discover that no one has any clue whatsoever how to deal with human mortality.

“Get through this and we’ll get through the worst of it,” someone said to me at my dad’s funeral, as if the funeral itself were death’s greatest burden, and not the permanent absence of the only dad I’ll ever have.

Gaffes like that are common. But insensitivity is just a symptom of much deeper issues, the first of which is our underlying fear of death, a fear that might only boil to the surface when we’re directly confronted by it, but that stays with us even as we try our best to ignore it. It’s a fear my grandmother summed up perfectly when she was dying — the terror of our own, permanent nonexistence. Which makes sense. After all, it’s our basic biological imperative to survive. But on top of that natural fear of death, there’s another, separate issue: our unwillingness, as a culture, to shine a light on that fear and talk about it. As a result, we keep this whole huge part of the human experience cloistered away.



“We’re literally lacking a vocabulary to talk about [death],” said Lennon Flowers, a co-founder of an organization called the Dinner Party, which brings together 20- and 30-somethings who have lost a loved one to discuss “the ways in which it continues to affect our lives.”

That lack of vocabulary is a big problem, and not just for people who directly experience loss. It’s a problem for all of us, because it means we each grapple alone with the natural fear of our own expiry. We deny the fear, we bury it under an endless stream of distractions. And so it festers, making us all the more invested in keeping it buried, for how painful it would be to take it out and look at it after letting it rot for so long.

But why all the self-enforced agony? Maybe it’s because a more honest relationship with death would mean a more honest reckoning with our lives, calling into question the choices we’ve made and the ways we’ve chosen to live. And damn if that isn’t uncomfortable.

Of course, if there’s one thing our culture is great at, it’s giving instruction on how to live. There are the clichés — “live each day to the fullest” and “dance like no one’s watching” — and beyond them an endless stream of messages telling us how to look better, feel better, lose weight, have better sex, get promoted, flip houses, and make a delicious, nutritious dinner in 30 minutes flat. But all of it is predicated on the notion that life is long and death is some shadowy thing that comes along when we hit 100. (And definitely not one minute before then!)

To get a sense of how self-defeating each of these goals can be, consider this chestnut given to us by the Native American sage Crazy Horse:

“Today is a good day to die, for all the things of my life are present.”

No, today is not a good day to die, because most of us feel we haven’t lived our lives yet. We run around from one thing to the next. We have plans to buy a house or a new car or, someday, to pursue our wildest dreams. We rush through the day to get to the evening, and through the week to get to the weekend, but once the weekend comes, we’re already thinking ahead to Monday morning. Our lives are one deferral after another.

Naturally, then, today isn’t a good day to die. How about tomorrow? Probably not. What number of days would we need to be comfortable saying what Crazy Horse said? Probably too big a number to count. We preserve the idea of death as an abstract thing that comes in very old age, rather than a constant possibility for us as fragile humans, because we build our whole lives atop that foundation.

What would we gain from finally opening up about death? How about the golden opportunity to consider what’s really important, not to mention the chance to be less lonely as we grapple with our own mortality, and the promise of being a real friend when someone we love loses someone they love. Plus it would all come back to us tenfold when we’re the ones going through a loss or reeling from a terminal diagnosis.

Sounds like a worthy undertaking, doesn’t it?

And that’s where there’s good news. Coming to grips with death is, as we’ve already established, really hard. But we at least have a model for doing so. Let’s consider, for example, the Times notable books I mentioned earlier. One of them, the graphic memoir “Can’t We Talk About Something More Pleasant?” provides an especially honest — and genuinely funny — account of author Roz Chast’s experience watching her parents grow old and die. The other book, Atul Gawande’s “Being Mortal,” reveals just how much even our medical establishment struggles with the end of life. Doctors are trained to treat sickness, of course, but often have little or no training in what to do when sickness is no longer treatable.

What both of these books do especially well is provide a vocabulary for articulating just how difficult a subject death can be for everyone — even the strongest and brightest among us. As a universal human experience, it isn’t something we should have to deal with alone. Struggling openly with death doesn’t make a person weak or maladjusted. And what Chast and Gawande both demonstrate is that talking about it doesn’t have to be awkward or uncomfortable, because these are anxieties that all of us have in common.

It’s a common refrain that what distinguishes humans from other animals is that humans can understand, on a rational level, the full magnitude of our mortality. But what also distinguishes humans is the richness of our relationships and the depths of our empathy — the ability we have to communicate our experiences and support those around us. Death is a deeply unsettling prospect, no matter who you are. But it doesn’t need to be a burden you face alone.

The following is a list of resources for those looking for an organized platform to discuss the topic of death:

  • Atul Gawande serves as an advisor to the Conversation Project, a site that encourages families to talk openly about end-of-life care — and to choose, in advance, whether they want to be at home or in a hospital bed, on life support or not — in short, to say in unequivocal terms what matters most when the end is near.
  • Vivian Nunez is the 22-year-old founder of a brand-new site called Too Damn Young. Nunez lost her mom when she was 10 and her grandmother – her second mother – 11 years later. “Losing someone you love is an extraordinarily isolating experience,” she said. “This is especially significant when you’re talking about teenagers, or a young adult, who loses someone at a young age, and is forced to face how real mortality is, and then not encouraged to talk about it.” She founded Too Damn Young so that bereaved teenagers will know they’re not alone and so they’ll have a public space to talk about it.
  • The Recollectors is a groundbreaking project by writer Alysia Abbott that tells the stories of people who lost a parent to AIDS. She’s exploding two big taboos – death and AIDS – in one clean shot.
  • Get Your Shit Together is another great one, a site launched by a young widow who learned the hard way that everyone should take some key steps to get their financial matters in order in case of an untimely death. “I (mostly) have my shit together,” the site’s founder says. “Now it’s your turn.”
  • There’s also Death Cafe, dedicated to “increasing awareness of death with a view to helping people make the most of their (finite) lives.” And Modern Loss, a site that’s received coverage from the New York Times and the Washington Post, shies away from nothing in its quest to tell stories about end of life and living with loss. “Death Cafe and Modern Loss have attracted a loyal following,” said Nicole Bélanger, author of “Grief in the Rearview: Three Motherless Years.” “They offer the safe space we crave to show up as we are, without worrying about having to polish up our grief and make it fit for public consumption.”

Perhaps these communities will start to influence the mainstream, as their emboldened members teach the rest of us that it’s OK, it’s really OK, to talk about death. If that happens, it will be a slow process – culture change always is. “Race and gender and myriad other subjects were forever taboo, but now we’re able to speak truth,” said Flowers of the Dinner Party. “And now we’re seeing that around death and dying.”

If she’s right, it’s the difference between the excruciating loneliness of hiding away our vulnerabilities and, instead, allowing them to connect us and bind us together.

In 2009, the president promised nuclear disarmament. Five years later, our stockpile remains frightfully intact

Obama channels Dr. Strangelove: How the president learned to stop worrying and love the bomb

James Carroll, TomDispatch.com


This piece originally appeared on TomDispatch.

Mark these days. A long-dreaded transformation from hope to doom is taking place as the United States of America ushers the world onto the no-turning-back road of nuclear perdition. Once, we could believe there was another way to go. Indeed, we were invited to take that path by the man who is, even today, overseeing the blocking of it, probably forever.

It was one of the most stirring speeches an American president had ever given. The place was Prague; the year was 2009; the president was the recently sworn in Barack Obama. The promise made that day is worth recalling at length, especially since, by now, it is largely forgotten:

“As the only nuclear power to have used a nuclear weapon, the United States has a moral responsibility to act… So today, I state clearly and with conviction America’s commitment to seek the peace and security of a world without nuclear weapons. I’m not naive. This goal will not be reached quickly — perhaps not in my lifetime. It will take patience and persistence. But now, we, too, must ignore the voices who tell us that the world cannot change. We have to insist, ‘Yes, we can…’”

President Obama had been in office only three months when, boldly claiming his place on the world stage, he unequivocally committed himself and his country to a nuclear abolition movement that, until then, had at best existed somewhere on the distant fringes of power politics. “I know,” he added, “that there are some who will question whether we can act on such a broad agenda. There are those who doubt whether true international cooperation is possible… and there are those who hear talk of a world without nuclear weapons and doubt whether it’s worth setting a goal that seems impossible to achieve. But make no mistake. We know where that road leads.”

The simple existence of nuclear weapons, an American president declared, paved the road to perdition for humanity.

Obama as the Captain Ahab of Nuclear Weapons

At that moment, the foundations for an imagined abolitionist world were modest indeed, but not nonexistent.  The 1968 Nuclear Non-Proliferation Treaty (NPT) had, for instance, struck a bargain between nuclear haves and have-nots, under which a path to abolition was treated as real.  The deal seemed clear enough: the have-nots would promise to forego obtaining nukes and, in return, the world’s reigning nuclear powers would pledge to take, in the words of the treaty, “effective measures in the direction of nuclear disarmament.”



For decades before the Obama moment, however, the superpower arsenals of nuclear warheads continued to grow like so many mushrooms, while new nuclear states — Israel, Pakistan, India, North Korea — built their own impressive arsenals.  In those years, with the singular exception of South Africa, nuclear-weapons states simply ignored their half of the NPT bargain and the crucial clause mandating progress toward eventual disarmament was all but forgotten.

When the Cold War ended in 1991 with the disappearance of the Soviet Union, and the next year Americans elected as president Bill Clinton, who was famously against the Vietnam War, it was at least possible to imagine that nukes might go the way of internationally banned chemical weapons. But Washington chose otherwise.  Despite a paucity of enemies anywhere on Earth, the Pentagon’s 1994 Nuclear Posture Review insisted on maintaining the American nuclear arsenal at Cold War levels as a “hedge,” an insurance policy, against an imagined return of Communism, fascism, or something terrible in Russia anyway — and Clinton accepted the Pentagon’s position.

Soon enough, however, even prominent hawks of the Cold War era began to worry that such a nuclear insurance policy could itself ignite a global fire. In 1999, a chief architect of the nuclear mindset, Paul Nitze, stepped away from a lifetime obsession with building up nuclear power to denounce nukes as “a threat mostly to ourselves” and to explicitly call for unilateral disarmament. Other former apostles of nuclear realpolitik also came to embrace the goal of abolition. In 2008, four high priests of the cult of nuclear normalcy — former Senator Sam Nunn, former Secretary of Defense William J. Perry, and former Secretaries of State George Shultz and Henry Kissinger — jointly issued a sacrilegious renunciation of their nuclear faith on the Wall Street Journal’s editorial page. “We endorse setting the goal of a world free of nuclear weapons,” they wrote, “and working energetically on the actions required to achieve that goal.”

Unfortunately, such figures had come to Jesus only after leaving office, when they were exempt from the responsibility of matching their high-flown rhetoric with the gritty work of making it real.

Obama in Prague was another matter.  He was at the start of what would become an eight-year presidency and his rejection of nuclear fatalism rang across the world. Only months later, he was awarded the Nobel Peace Prize, in large part because of this stunning commitment. A core hope of the post-World-War-II peace movement, always marginal, had at last been embraced in the seat of power. A year later, at Obama’s direction, the Pentagon, in its 2010 Nuclear Posture Review, actually advanced the president’s purpose, committing itself to “a multilateral effort to limit, reduce, and eventually eliminate all nuclear weapons worldwide.”

“The United States,” that document promised, “will not develop new nuclear warheads.” When it came to the future of the nuclear arsenal, a program of responsible maintenance was foreseen, but no new ground was to be broken. “Life Extension Programs,” the Pentagon promised, “will use only nuclear components based on previously tested designs, and will not support new military missions or provide new military capabilities.”

Obama’s timing in 2009 was critical. The weapons and delivery systems of the nuclear arsenal were aging fast. Many of the country’s missiles, warheads, strategic bombers, and nuclear-powered submarines dated back to the early Cold War era and were effectively approaching their radioactive sell-by dates. In other words, massive reductions in the arsenal had to begin before pressures to launch a program for the wholesale replacement of those weapons systems grew too strong to resist.  Such a program, in turn, would necessarily mean combining the latest technological innovations with ever greater lethality in a way guaranteed to reinvigorate the entire enterprise across the world — the polar opposite of “effective measures in the direction of nuclear disarmament.”

Obama, in other words, was presiding over a golden moment, but an apocalyptic deadline was bearing down. And sure enough, that deadline came crashing through when three things happened: Vladimir Putin resurfaced as an incipient fascist intent on returning Russia to great power status; extremist Republicans took Congress hostage; and Barack Obama found himself lashed, like Herman Melville’s Captain Ahab, to “the monomaniac incarnation of all those malicious agencies which some deep men feel eating in them, till they are left living on half a heart and half a lung.” Insiders often compare the Pentagon to Moby Dick, the Great White Whale, and Obama learned why. The peaceful intentions with which he began his presidency were slapped away by the flukes of the monster, like so many novice oarsmen in a whaling skiff.

Hence Obama’s course reversals in Iraq, Afghanistan, and Syria; hence the White House stumbles, including an unseemly succession of secretaries of defense, the fourth of whom, Ashton Carter, can reliably be counted on to advance the renewal of the nuclear force. The Pentagon’s “intangible malignity,” in Melville’s phrase, was steadily quickened by both Putin and the Republicans, but Obama’s half-devoured heart shows in nothing so much as his remarkably full-bore retreat, in both rhetoric and policy, from the goal of nuclear abolition.

A recent piece by New York Times science correspondent William J. Broad made the president’s nuclear failure dramatic. Cuts to the U.S. nuclear stockpile initiated by George H.W. Bush and George W. Bush, he pointed out, totaled 14,801 weapons; Obama’s reductions so far: 507 weapons. In 2010, the New START treaty between Moscow and Washington capped future deployed nukes at 1,550. As of this October, the U.S. still deploys 1,642 of them and Russia 1,643; neither nation, that is, has achieved START levels, which only count deployed weapons. (Including stored but readily re-armed and targeted nukes, the U.S. arsenal today totals about 4,800 weapons.)

In order to get the votes of Senate Republicans to ratify the START treaty, Obama made what turned out to be a devil’s bargain.  He agreed to lay the groundwork for a vast “modernization” of the U.S. nuclear arsenal, which, in the name of updating an aged system, is already morphing into a full-blown reinvention of the arms cache at an estimated future cost of more than a trillion dollars. In the process, the Navy wants, and may get, 12 new strategic submarines; the Air Force wants, and may get, a new long-range strike bomber force. Bombers and submarines would, of course, both be outfitted with next-generation missiles, and we’d be off to the races. The arms races.

All of this unfolds as Vladimir Putin warms the hearts of nuclear enthusiasts everywhere not only by his aggressions in Ukraine, but also by undercutting the landmark 1987 Intermediate-Range Nuclear Forces Treaty by testing a new ground-launched cruise missile. Indeed, just this fall, Russia successfully launched a new intercontinental ballistic missile. It seems that Moscow, too, can modernize.

On a Twenty-First Century Road to Perdition

Responding to the early Obama vision of “effective measures” toward nuclear disarmament, and following up on that 2010 Nuclear Posture Review, senior Pentagon officials pursued serious discussions about practical measures to reduce the nuclear arsenal. Leading experts advocated a shift away from the Cold War’s orgasmic strike targeting doctrine that still necessitates an arsenal of weapons counted in the thousands.

In fact, in response to budget constraints, legal obligations under a jeopardized non-proliferation treaty, and the most urgent moral mandate facing the country, America’s nuclear strategy could shift without wrenching difficulty, at the very least, to one of “minimal deterrence.” Hardcore national security mavens tell us this. Such a shift would involve a reduction in both the deployed and stored nuclear arsenal to something like 500 warheads. Even if that goal were pursued unilaterally, it would leave more than enough weaponry to deter any conceivable state-based nuclear threat, including Russia’s, no matter what Putin may do.

Five hundred is, of course, a long way from zero and so from the president’s 2009 goal of abolition, and yet opposition even to that level would be fierce in Washington. Though disarming and disposing of thousands of nukes would cost far less than replacement, it would still be expensive, and you can count on one thing: Pentagon nuclearists would find firm allies among congressional Republicans, who would be loath to fund such a retreat from virtue’s Armageddon. Meanwhile, confronting such cuts, the defense industry’s samurai lobbyists would unsheathe their swords.

But if a passionate Obama could make a compelling case for a nuclear-free world from Prague in 2009, why not go directly to the American people and make the case today? There is, of course, no sign that the president intends to do such a thing any longer, but if a commander-in-chief were to order nuclear reductions into the hundreds, the result might actually be a transformation of the American political conscience. In the process, the global dream of a nuclear-free world could be resuscitated and the commitment of non-nuclear states (including Iran) to refrain from nuclear-weapons development could be rescued. Most crucially, there would no longer be any rationale for the large-scale reinvention of the American nuclear arsenal, a deadly project this nation is even now preparing to launch. At the very least, a vocal rededication to an ultimate disarmament, to the actual abolition of nuclear weapons, would keep that road open for a future president to re-embark upon.

Alas, Pentagon advocates of “minimal deterrence” have already been overridden. The president’s once fiercely held conviction is now a mere shadow of itself. As happened with Ahab’s wrecked whaling ship, tumultuous seas are closing over the hope that once seized the world’s attention. Take it for granted that, in retirement and out of power, ex-president Obama will rediscover his one-time commitment to a world freed from the nuclear nightmare. He will feel the special responsibility proper to a citizen of “the only nuclear power to have used a nuclear weapon.” The then-former president’s speeches on the subject will be riveting and his philanthropy will be sharply targeted. All for naught.

Because of decisions likely to be taken this year and next, no American president will ever again be able to embrace this purpose as Obama once did. Nuclear weapons will instead become a normalized and permanent part of the twenty-first century American arsenal, and therefore of the arsenals of many other nations; nuclear weapons, that is, will have become an essential element of the human future — as long as that future lasts.

So yes, mark these days down. Nuclear abolition itself is being abolished. Meanwhile, let us acknowledge, as that hopeful young president once asked us to, that we know where this road leads.

James Carroll is the bestselling author of the National Book Award-winning memoir “An American Requiem”; “Constantine’s Sword,” a history of Christian anti-Semitism; and 10 novels. His latest book is “Jerusalem, Jerusalem: How the Ancient City Ignited Our Modern World.” He lectures widely on war and peace and on Jewish-Christian-Muslim reconciliation. He lives in Boston.

US budget resolution funds war and repression


By Patrick Martin
13 December 2014

The omnibus spending resolution adopted by the US House of Representatives just before midnight Thursday, and which is now before the Senate, is a detailed public statement of the priorities of the American ruling elite. The bulk of the more than $1.1 trillion in funding goes to the military and other repressive functions of the federal government, such as spying, prisons and the police.

President Obama hailed the measure as a “bipartisan effort to include full-year appropriations legislation for most government functions that allows for planning and provides certainty, while making progress toward appropriately investing in economic growth and opportunity, and adequately funding national security requirements.” In other words, the bill makes it possible for the administration to continue waging war around the world and building up the apparatus for a police state at home.

Attached to the funding bill are hundreds of policy measures, many of them added at the last minute with no public discussion and, in many cases, without most congressmen or senators even being aware of what was being proposed before they rubber-stamped the bill. These include, most notoriously, the repeal of a major section of the Dodd-Frank legislation that sought to place some restrictions on the speculative activities of the banks following the 2008 financial crash.

The language in this section, permitting banks to use federally insured deposits to gamble in the swaps and derivative markets, was literally drafted by the banks. According to an analysis by the New York Times, 70 of the 85 lines in that section of the bill come directly from Citibank, which spearheaded the lobbying by Wall Street on this issue.

The four largest Wall Street banks conduct 93 percent of all US derivatives trading, so the measure is a brazen demonstration of the subservience of Congress to the big banks. According to the Washington Post, Jamie Dimon, CEO of JP Morgan Chase, another of the big four banks, personally telephoned individual congressmen to urge them to vote for the amendment to Dodd-Frank.

The House of Representatives passed the funding bill late Thursday by a vote of 219 to 206 after a delay of seven hours. The delay was to allow the Obama administration to pressure a sufficient number of Democratic congressmen to support the Republican-drafted bill and offset defections among ultra-right Republicans who wanted the legislation to block Obama’s executive order on immigration.

The final vote saw 162 Republicans and 57 Democrats supporting the bill, while 136 Democrats and 70 Republicans opposed it. As always, just enough Democratic votes were found to assure that the reactionary measure passed, the government agencies were funded, and the financial markets were reassured.

Some liberal Democrats, most notably the minority leader, Nancy Pelosi, made speeches posturing as opponents of the legislation. Pelosi even declared, in a comment that was widely publicized, that she was “enormously disappointed that the White House feels that the only way they can get a bill is to go along with this.”

But in remarks to a meeting of the Democratic caucus, Pelosi gave the game away, refusing to seek a party-line vote and instead telling members, “I’m giving you the leverage to do whatever you have to do.” The second-ranking and third-ranking Democratic leaders, Minority Whip Steny Hoyer and Deputy Whip James Clyburn, broke with Pelosi and sided with the White House on the bill, openly recruiting the votes required for passage.

Along with the $1.1 trillion bill that will fund most federal agencies through September 30, the House passed by voice vote a resolution funding the whole government through Saturday midnight, to give the Senate time to act on the main measure. The Senate approved this stopgap as well, and Obama signed it at the White House on Friday morning.

The House met again Friday afternoon and passed another extension, this time for five days, giving the Senate until midnight Wednesday to complete action on the funding legislation. Ultimate Senate passage is not in doubt. Outgoing Majority Leader Harry Reid has given his public backing, saying Thursday, “I’m upset with certain things in the bill. It’s not perfect. But a longer-term funding is much better for our economy than a short-term one.”

Most press coverage of the funding bill gives the following breakdown of the spending: $521 billion for the military, $492 billion for nonmilitary items, and $73 billion in emergency spending, most of it military-related. This is highly misleading, since much of the “nonmilitary” spending is demonstrably in support of US military operations or domestic police and security operations directed against the American population.

The $492 billion of “nonmilitary” spending includes the following, according to the official summary posted on the website of Congress:

· $11.4 billion for the National Nuclear Security Administration, the unit of the Department of Energy that assembles US nuclear weapons.

· $40.6 billion for the Department of Energy, NASA, NSF and other scientific research, much of it related to nuclear energy, cybersecurity and missile technology.

· $65 billion for the Veterans Administration, which provides medical care and other services for those shattered in body and mind by their service as cannon fodder in American wars.

· $26.7 billion for the Department of Justice, which includes the FBI, DEA and BATF ($10.7 billion), federal prisons ($6.9 billion), and aid to local police ($2.3 billion).

· $25 billion for the Department of Homeland Security, which is funded only through February 27, 2015 because of its role in enforcing immigration policy (the full-year amount would be more than $60 billion).

· $7 billion from the health budget for biodefense and bioterrorism research.

· An undisclosed figure, believed to be in the range of $60 billion, for intelligence operations, including the CIA and 17 other federal agencies.

At a minimum, these figures, which together add up to roughly $236 billion, suggest that nearly half of the supposedly “nonmilitary” spending is actually directed to sustaining the military-intelligence capabilities of American imperialism.

Adding that $236 billion to the explicitly military and overseas contingency funding ($521 billion plus $73 billion), the real dimensions of the US military-intelligence-police-prison complex begin to come into view: a staggering $830 billion, roughly 75 cents out of every dollar in the funding bill, is devoted to killing, spying on, imprisoning or otherwise oppressing the people of the world, including the American people.

Further details of the massive legislation, weighing in at more than 1,600 pages, will undoubtedly emerge over the coming days. Among the provisions worth taking note of:

· The bill provides $3.1 billion in aid to Israel, mostly financial subsidies, and $1.45 billion in aid to Egypt, most of it military, as well as $1 billion in aid to Jordan, another US client state in the region.

· The bill eliminates the Obama administration’s Race to the Top program, used for six years to promote private charter schools and attacks on teachers in public schools. Republicans attacked the program as an effort to impose federal standards in education.

· The bill bans enforcement of a series of environmental and labor regulations, ensuring that air and water will be more polluted and workers will be more brutally exploited.

 

http://www.wsws.org/en/articles/2014/12/13/budg-d13.html

What the decline of McDonald’s really means

Death of a fast-food Goliath: McDonald’s is on the decline in America. Here’s why that isn’t automatically good news


The reign of the golden arches is ending. McDonald’s reported this week that its already-declining U.S. sales nose-dived in November, down 4.6 percent compared to last year. The company that introduced America to fast food, and has come to stand as its icon, is fading away.

It would be a mistake, however, to think that America’s falling out of love with fast food. It’s just that these days, we call it Chipotle.

There are plenty of reasons ascribed to McDonald’s downfall, but one constantly cited is that the post-“Super Size Me” world just isn’t eager to subject itself to the chain’s version of food any longer. Americans, the narrative goes, are demanding more of their meals. And the spoils are going to a new generation of “fast casual” providers who can prove they’re rising to the challenge.

McDonald’s certainly seems to believe this to be the case. While the company says it plans to pare its offerings down to the essentials, one of the new initiatives it’s spearheading in an attempt to reverse its fortunes is a sleek, touch-screen burger customization system, giving customers a greater degree of control over the options they do have — just like at Chipotle. Crucially, the company also seems to be realizing that there may be a problem with the food itself: building on its recent P.R. campaign aimed at demystifying the origins of McRibs and McNuggets, executives say they’re also considering paring down the list of ingredients in their highly processed offerings. In November, it rejected a new variety of genetically modified potato from its biggest supplier.



None of that, however, holds a candle to the image Chipotle is selling, best encompassed by “The Scarecrow.” The viral video’s success lay in its portrayal of everything Chipotle claims its food isn’t: unnatural, inhumanely raised, factory-farm meat laden with God-knows-what chemicals and additives. “From the very beginning, Chipotle has used really high-quality fresh ingredients, and prepares all the foods we serve,” company spokesman Chris Arnold boasted to the AFP. “So from the beginning, we were doing something which is pretty different than what was happening in traditional American fast food.”

Chipotle has opened itself up to a fair amount of scrutiny, however, from critics who say it’s overselling just how enlightened its “farm to face” fare truly is. Most of its food isn’t organic, and the company still uses genetically modified ingredients. While its efforts to source local, humanely raised and antibiotic-free meat are encouraging, it isn’t always able to live up to its own high (and highly advertised) standards. In some cases, customers end up paying a premium for a product that’s more of the same.

This is a system-wide trend. Promising signs that other fast food giants are beginning to reform their ways — Panera ditching artificial additives, Chick-Fil-A eliminating chickens raised with antibiotics, Burger King removing gestation crates from its pork supply chain and switching over to cage-free eggs, In-N-Out Burger paying its workers a living wage — are laudable, but they shouldn’t be mistaken for more than what they are: positive, P.R.-garnering baby steps toward improvement of a system that requires a total overhaul.

What Mark Bittman calls Improved Fast Food is still, after all, fast food. It still comes laden with fat, sodium and calories, often in excess of what you’ll get from McDonald’s. Even Chipotle continues to pour Coca-Cola. “Natural,” one of the new guard’s go-to adjectives, is a word with plenty of positive connotations but no FDA-enforceable definition; “humanely raised,” as a standard for livestock, is fallible at best. And I hate to break it to friends of Five Guys and Smashburger, but there’s really no such thing as a “better burger” (or, for that matter, a better beef burrito) from a health perspective, and certainly not if you’re looking for a sustainable meal. Our growing understanding of diet’s contribution to climate change, on the contrary, holds that we’ve got to drastically cut down our consumption of meat, and of beef in particular.

But consumers don’t care how a Big Mac compares to a burrito in terms of fat and calories, according to a lengthy analysis of the company’s downfall in Fortune – they just care that Chipotle’s food is “seen as being natural, unprocessed and sustainable.” McDonald’s failing may not be that it’s so much worse than its competitors — it’s just that it’s so much worse at making itself look better. This is a food provider, after all, that’s still working to convince us that its burgers don’t contain ground-up worms.

That Americans are seeking out fresh, healthy food is unequivocally a good thing, one that’s already brought about some important reforms. But we’re still a long way away from revolution.

Lindsay Abrams is a staff writer at Salon, reporting on all things sustainable. Follow her on Twitter @readingirl, email labrams@salon.com.

http://www.salon.com/2014/12/12/why_the_end_of_mcdonalds_doesnt_mean_the_end_of_fast_food/?source=newsletter


Every sci-fi movie since Kubrick’s 1968 masterpiece has echoed the original in certain unavoidable ways

Kubrick’s indestructible influence: “Interstellar” joins the long tradition of borrowing from “2001”


When I first heard about Christopher Nolan’s new sci-fi adventure, “Interstellar,” my immediate thought was only this: Here comes the latest filmmaker to take on Stanley Kubrick’s “2001: A Space Odyssey.” Though it was released more than 40 years ago, “2001” remains the benchmark for the “serious” science fiction film: technical excellence married to thematic ambition, and a pervading sense of historic self-importance.

More specifically, I imagined that Nolan would join a long line of challengers to aim squarely at “2001’s” famous Star Gate sequence, where astronaut Dave Bowman (Keir Dullea) passes through a dazzling space-time light show and winds up at a waystation en route to his transformation from human being into the quasi-divine Star Child.

The Star Gate scene was developed by special effects pioneer Douglas Trumbull, who modernized an old technique known as slit-scan photography. While we’ve long since warp-drived our way beyond the sequence effects-wise (you can now do slit scan on your phone), the Star Gate’s eerie and propulsive quality is still powerful, because it functions as much more than just eye candy. It’s a set piece whose theme is the attempt to transcend set pieces — and character, and narrative and, most of all, the technical limitations of cinema itself.

In “2001,” the Star Gate scene is followed by another scene that also turns up frequently in sci-fi flicks. Bowman arrives at a series of strange rooms, designed in the style of Louis XVI (as interpreted by an alien intelligence), and he watches himself age and die before being reborn. Where is he? Another galaxy? Another dimension? Heaven? Hell? What are the mysterious monoliths that have brought him here? Why?

Let’s call this the Odd Room Scene. Pristine and uncanny, the odd room is the place at the end of the journey where truths of all sorts, profound and pretentious, clear and obscure, are at last revealed. In “The Matrix Reloaded,” for instance, Neo’s Odd Room Scene is his meeting with an insufferable talking suit called the Architect, where he learns the truth about the Matrix. Last summer’s “Snowpiercer,” about a train perpetually carrying the sole survivors of a new Ice Age around the world, follows the lower-class occupants of the tail car as they stage a revolution, fighting and hacking their way through first class toward the train’s engine, an Odd Room where our hero learns the truth about the train.



These final scenes in “2001” still linger in the collective creative consciousness as inspiration or as crucible. The Star Gate and the Odd Room, particular manifestations of the journey and the revelation, have become two key architectural building blocks of modern sci-fi films. The lure to imitate and try to top these scenes, either separately or together, is apparently too powerful to resist.

Perhaps the most literal of the Star Gate-Odd Room imitators is Robert Zemeckis’s 1997 “Contact.” It’s a straightforward drama about humanity’s efforts to build a large wormhole machine whose plans have been sent by aliens, and the debate over which human should be the first to journey beyond the solar system. The prize falls to Jodie Foster’s agnostic astronomer Ellie Arroway. During the film’s Star Gate sequence, Foster rides a capsule through a wormhole that winds her around distant planets and through a newly forming star. Zemeckis’s knockoff is a decent roller coaster, but nothing more. Arroway is anxious as she goes through the wormhole, but still in control of herself; a deeply distressed Bowman, by contrast, is losing his mind.

Arroway’s wormhole deposits her in an Odd Room that looks to her (and us) like a beach lit by sunlight and moonlight. She is visited by a projection of her dead father, the aliens’ way of appearing to her in a comfortable guise, and she learns the stunning truth about … well, actually, she doesn’t learn much. Her father gives her a Paternal Alien Pep Talk. Yes, there is a lot of life out in the galaxy. No, you can’t hang out with us. No, we’re not going to answer any of your real questions. Just keep working hard down there on planet Earth; you’ll get up here eventually (as long as you all don’t kill each other first).

Brian De Palma tried his own version of the Odd Room at the end of 2000’s “Mission to Mars,” which culminates in a team of astronauts entering a cool, Kubrick-like room in an alien spaceship on Mars and, yes, learning the stunning truth about the origins of life on Earth. De Palma is a skilled practitioner of the mainstream Hollywood set piece, but you can feel the film working up quite a sweat trying and failing to answer “2001,” and the early-century digital effects depicting red Martians are, to be charitable, somewhat dated.

But here comes “Interstellar.” This film would appear to be the best shot we’ve had in years to challenge the supremacy of the Star Gate, of “2001” itself, as a Serious Sci-Fi Film About Serious Ideas. Christopher Nolan should be the perfect candidate to out-Star Gate the Star Gate. Kubrick machined his visuals to impossibly tight tolerances. Nolan and his screenwriter brother Jonathan do much the same to their films’ narratives, manufacturing elaborately conceived contraptions. The film follows a Hail Mary pass to find a planet suitable for the human race as the last crops on Earth begin to die out. Matthew McConaughey plays an astronaut tasked with piloting a starship through a wormhole, into another galaxy and onto a potentially habitable planet. “Interstellar” promises a straight-ahead technological realism as well as a sense of conscious “We’re pushing the envelope” ambition. (Hey, even Neil deGrasse Tyson vouches for the film’s science bona fides.) The possibilities and ambiguities of time, one of Nolan’s consistent concerns as a storyteller, are meant, I think, to be the trump card that takes “Interstellar” past “2001.”

But the film is not about fealty to, or the realistic depiction of, relativity theory. It’s about “2001.” And before it can try to usurp the throne, “Interstellar” must first kiss the ring. (And if you haven’t seen “Interstellar” yet, you might want to stop reading now.) So we get the seemingly rational crewmember who proves to be homicidal. The dangerous attempt to manually enter a spaceship. More brazenly, there’s a set piece of one ship docking with another. In “2001,” the stately docking of a spaceship with a wheel-shaped space station, turning gently above the Earth to the strains of the Blue Danube, was, quite literally, a waltz, a graceful celestial courtship. It clued us in early that the machines in “2001” would prove more lively, more human, than the humans. “Interstellar” assays the same moment, only on steroids. It turns that waltz, so rich in subtext, into a violent, vertiginous fandango as a shuttle tries to dock with a mothership that’s pirouetting out of control.

Finally, after a teasing jaunt through a wormhole earlier in the movie, we come to “Interstellar’s” Star Gate moment, as McConaughey’s Cooper plummets into a black hole and ultimately into a library-like Odd Room that M.C. Escher might have fancied. It’s visually impressive for a moment, but its imprint quickly fades.

It’s too bad. “Interstellar” wants the stern grandeur of “2001” and the soft-hearted empathy of Steven Spielberg, but in most respects achieves neither. Visually, only a few images impress themselves in your brain — Nolan, as is often the case in his movies, is more successful designing and calibrating his story than at creating visuals worthy of his ambition. Yet the film doesn’t manage the emotional dynamics, either. It’s not for lack of trying. The Nolan brothers are rigorous scenarists, and the concept of dual father-daughter bonds being tested and reaffirmed across space-time is strong enough on the drawing board. (Presumably, familial love is sturdier than romantic love, though the film makes a half-hearted stab at the latter.)

For those with a less sentimental bent, the thematic insistence on the primacy of love might seem hokey, but it’s one way the film tries to advance beyond the chilly humanism of Kubrick toward something more warm-blooded. Besides, when measured against the stupefying vastness of the universe, what other human enterprise besides love really matters? The scale of the universe and its utter silence is almost beyond human concern, anyway.

So I don’t fault a film that suggests that it’s love more than space-age alloys and algorithms that can overcome the bounds of space and time. But the big ideas Nolan is playing with are undercut by too much exposition about what they mean. The final scene between Cooper and his elderly daughter — the triumphant, life-affirming emotional home run — is played all wrong, curt and businesslike. It’s a moment Spielberg would have handled with more aplomb; he would have had us teary-eyed, for sure, even those who might feel angry at having their heartstrings yanked so hard. This is more like having a filmmaker give a lecture on how to pull at the heartstrings without actually doing it.

Look, pulling off these Star Gate-like scenes requires an almost impossible balance. The built-in expectations in the structure of the story itself are unwieldy enough, without the association to one of science fiction’s most enduring scenes. You can make the transcendent completely abstract, like poetry, a string of visual and aural sensations, and hope viewers are in the right space to have their minds blown, but you run the risk of copping out with deliberate obfuscation. (We can level this charge at the Star Gate sequence itself.)

But it’s easy to press too far the other way — to personify the higher power or the larger force at the end of these journeys with a too literal explanation that leaves us underwhelmed. I suppose what we yearn for is just a tiny revelation, one that honors our desire for awe, preserves a larger mystery, but is not entirely inaccessible. It’s a tiny taste of the sublime. There’s an imagined pinpoint here where we would dream of transcendence as a paradox, as having God-like perception and yet still remaining human, perhaps only for a moment before crossing into something new. For viewers, though, the Star Gate scenes ultimately play on our side of that crossroads: To be human is to steal a glimpse of the transcendent, to touch it, without transcending.

While Kubrick didn’t have modern digital effects to craft his visuals with, in retrospect he had the easier time of it. It’s increasingly difficult these days to really blow an audience’s minds. We’ve seen too much. We know too much. The legitimate pleasure we can take in knowledge, in our ability to decode an ever-more-complex array of allusions and references, may not be as pleasurable or meaningful as truly seeing something beyond what we think we know.

Maybe the most successful challenger to Kubrick was Darren Aronofsky and his 2006 film “The Fountain.” The film, a meditation on mortality and immortality, plays out in three thematically linked stories: A conquistador (Hugh Jackman) searches the new world for the biblical Tree of Life; a scientist (Jackman again) tries to save his cancer-stricken wife (Rachel Weisz); and a shaven-headed, lotus-sitting traveler (Jackman once more) journeys to a distant nebula. It’s the latter that bears the unique “2001” imprint of journey and revelation: Jackman travels in a bubble containing the Tree of Life, through a milky and golden cosmicscape en route to his death and rebirth. It’s the Star Gate and the Odd Room all in one. Visually, Aronofsky eschewed computer-generated effects for a more organic approach that leans on fluid dynamics. I won’t tell you the film is a masterpiece — its Grand Unifying ending is more than a little inscrutable; again, pulling this stuff off is a real tightrope act — but the visuals are wondrous and unsettling, perhaps the closest realization since the original of what the Star Gate sequence is designed to evoke.

Having said that, though, it may be time to turn away from the Star Gate in our quest for the mind-blowing sci-fi cinematic sequence. Filmmakers have thus far tried to imagine something like it, only better, and have mostly failed. It’s harder to imagine something beyond it, something unimaginable. Maybe future films should not be quite so literal in chasing those transcendent moments. This might challenge a new generation of filmmakers while also allowing the Star Gate, and “2001” itself, to lie fallow for a while, so we can return to it one day with fresh eyes.

It is, after all, when we least suspect it that a story may find a way past our jaded eyes and show us a glimpse of something that really does stir a moment of profound connection. There is one achingly brief moment in “Interstellar” that accomplishes this: Nolan composes a magnificent shot of a small starship, seen from a great distance, gliding past Saturn’s awesome rings. The ship glitters in a gentle rhythm as it catches the light of the Sun. It’s a throwaway, a transitional moment between one scene and another, merely meant to establish where we are. But its very simplicity and beauty, the power of its scale, invites us for a moment to experience the scale of the unknown and to appreciate our efforts to find a place in it, or beyond it.

http://www.salon.com/2014/11/22/kubricks_indestructible_influence_interstellar_joins_the_long_tradition_of_borrowing_from_2001/?source=newsletter