“BOYHOOD” THE MOVIE


 

I watched “Boyhood” last night. Didn’t think I could deal with a film running nearly three hours focused on the reality-based coming of age theme. I was, however, much impressed by the epic technical achievement the film represents, and I was deeply moved by the genuinely human intimacies shared throughout. The ending was a powerful insight into the human condition.

Got me to thinking about the values of the tech-fueled Bay Area where I live.

I really loathe, truly hate, the materialistic, money-fueled tech culture that has enveloped San Francisco. And it’s not the technology per se. I’ve been using and building computers since 1985. It’s the disgusting excess and glorification of same.

Interestingly, watching “Boyhood” last night reminded me that there are other, more appealing, lifestyles and choices still available in the country. The main character in the film was not obsessed with tech. He questions the value of the ubiquitous smart phone. He works after school. Middle class. He doesn’t dream of going to Stanford or MIT, etc., to get a degree in CS and code. Hell, he wants to be an artist. He’s interested in the meaning of life. Like people I used to know in school and throughout my life. He represents my American Dream. Not this SF version with conspicuous consumption and phony hipster culture.

 

 

A Sucker Is Optimized Every Minute

Credit: Illustration by Javier Jaén

Not long ago, our blockbuster business books spoke in unison: Trust your gut. The secret to decision-making lay outside our intellects, across the aisle in our loopy right brains, with their emo melodramas and surges of intuition. Linear thinking was suddenly the royal road to ruin. Dan Ariely’s “Predictably Irrational” tracked the extravagant illogic of our best judgment calls. The “Freakonomics” authors urged us to think like nut jobs. In “Blink,” Malcolm Gladwell counseled abandoning scientific method in favor of snap judgments. Tedious hours of research, conducted by artless cubicle drones, became the province of companies courting Chapter 11. To the artsy dropouts who could barely grasp a polynomial would go the spoils of the serial bull markets.

In 2007, when Barack Obama first visited Google’s headquarters as a candidate, he announced himself as less a torchbearer than a data connoisseur. “I am a big believer in reason and facts and evidence and science and feedback,” he told the Google crowd. “That’s what we should be doing in our government.” This was music to the company’s ears, as one of its proudest internal inventions was “A/B testing” — an optimization process, now widespread, that constantly tests design tweaks on us to see how they perform.
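The statistical core of A/B testing is simple enough to sketch. The snippet below is a minimal, hypothetical illustration (not Google’s or Optimizely’s actual machinery, and the numbers are invented): it asks whether page variant B’s conversion rate beats variant A’s by more than chance would allow, using a standard two-proportion z-test.

```python
import math

def ab_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: did variant B convert better than variant A?

    conv_*: number of conversions; n_*: number of visitors shown each variant.
    Returns the z-score and a one-sided p-value for "B beats A".
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # One-sided p-value from the standard normal CDF, via math.erf.
    p_value = 1 - 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return z, p_value

# Invented traffic: variant B's headline converts 120/1000 vs. A's 100/1000.
z, p = ab_test(100, 1000, 120, 1000)
print(f"z = {z:.2f}, one-sided p = {p:.3f}")
```

At scale the test runner also handles traffic splitting and early stopping, but the decision at the bottom is this comparison, run over and over on live visitors.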

Dan Siroker, a Google employee, was so smitten with this rhetoric that he went to work on Obama’s campaign, creating an audience for electoral propaganda by optimizing the campaign’s “squeeze page,” which is where a site bilks visitors of their email addresses. Now Siroker is chief executive of Optimizely, a “web-testing firm” that doubles as the Oval Office for the ascendant ideology of everything-optimization.

Optimization sounds dignified and scientific, and sometimes it is, mindbogglingly so. But in lifestyle headlines it has become something less than common sense. In the last few years, The Huffington Post has doled out advice on how to “optimize” your three-day weekend, your taxes, your Twitter profile, your year-end ritual, your sex drive, your website, your wallet, your joy, your workouts, your Social Security benefits, your testosterone, your investor pitch, your news release, your to-do list and the world itself. I’m not giving away trade secrets when I reveal that, according to HuffPo, the Big Three ways to optimize your sex drive are: exercise, relax and don’t drink too much. Equally snoozy is the optimization strategy for a three-day weekend: Plan well and turn off your phone.

Like the best corporate argot, “optimize” is a back-formation. Some uses seem to have derived from the Latin optimus, which the poet Horace used to mean “morally good and indifferent to trivia.” But others appear to come from “optimist” — which is rich, given that optimizers consider themselves cold-eyed realists. If they want to improve anything, it’s not America’s topsoil or fellow-man stuff but rather something more Ayn Randian: efficiency, maybe, or performance.

Optimization addresses itself not to our inner hero but to our inner bean counter. Perhaps the reason it has such appeal is that it doesn’t require Olympian talent or Randian ambition. To do it, you need only to have a computer or to evince a computer-like immunity to boredom. All of that data — the insight that, say, 70 percent of three-day weekends suffer when 10 percent less planning is done — must intrigue you. You feel better in the mornings when you take B vitamins the night before, but only when you’ve ingested dairy products at the same time? Whoa, there: Your ingenuity as an optimizer now lies in your capacity to recognize this as a data bonanza.

On the web, “optimizing” has become a fine art — and, if not a dark art, at least a dim one that has become dimmer (and finer) since Siroker did it for Obama in 2007. For years, search-engine optimization, or S.E.O., has turned web pages into Googlebait. These days, optimizers of squeeze pages, drawing lessons as much from the labcoats at Optimizely as from the big daddies at Google, recommend creating a three-to-10-minute video introduced by a “magnetic headline” (“Find the Perfect Lampshade for Any Lamp”) and quickly chasing it with an “information gap” like “You’re Not Going to Believe the Trick I Use While Lampshade Shopping.” (Article of faith among optimizers: humans find information gaps intolerable and will move heaven and earth to close them.) Next you get specific: “Click the play button to see me do my lampshade trick!” — after which the video unspools, only to stall at the midpoint with a virtual tollbooth. You can’t go on unless you hand over an email address. Presto.

A sucker is optimized every minute.

For optimizers, all values flatten: There’s optimal at one end and the dread suboptimal at the other. This can be freeing for those who get worked up by emotional, political or moral language. In theory, through optimization, arguments can be dispassionately adjudicated and then resolved without tears. You find Inkwell, the true black-and-white Instagram filter, beautiful? Sorry: Instagram photos filtered with the purplish monochrome Willow get way more hearts than Inkwell photos. I’m just saying. I mean, it’s just data.

Earlier this month, when Hillary Clinton defended her use of a personal email account for communications of state, she refused to acknowledge an ethical breach — only an optimizational one. “Looking back, it would have been better for me to use two separate phones and two email accounts,” she said. “I thought using one device would be simpler, and obviously, it hasn’t worked out that way.” Her choice was neither right nor wrong, then, neither honest nor sinister. It was merely, as the blog TechPresident put it, “less than optimal.”

“Of course it is sinister,” wrote Andrew Meier, the author of “The Lost Spy: An American in Stalin’s Secret Service,” via email. “It’s venomous, even. Stalin is all about optimization. Take the gulag, the greatest example, and achievement, of Soviet optimization. The lords of the gulag had charts and charts re: minimum food intake and maximum work output.”

Maximum work, minimum food. Such was optimization pre-Google. A grim application, perhaps, but we shouldn’t be surprised that the systems-obsessed Soviets possessed the will to optimize early on. In “Red Plenty,” a novelized history of the Bolshevik promise of abundance, Francis Spufford explains Moscow’s real “potato-optimizing program” of the 1960s. To get potatoes into the hands of as many Muscovites as possible and thus create the impression of agricultural bounty, a B.E.S.M. mainframe — Large Electronically Computing Machine, in English — churned through 75,000 variables, subject to 563 constraints. Spufford spells out why optimization and computers grew up together: “This problem is out of reach of fingers and slide rules. But thanks to computers, thanks to the B.E.S.M.’s inhuman patience at iterating approximate answers over and over again, it is a problem that can be solved.”
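The potato program Spufford describes is a linear program: minimize total haulage cost subject to supply and demand constraints. A toy version with invented numbers (two depots and two districts instead of 75,000 variables and 563 constraints) is small enough to solve by simple enumeration:

```python
# Toy transportation problem: two potato depots supplying two city districts.
# Goal: minimize total shipping cost subject to supply and demand constraints.
# (Invented numbers; the real 1960s Moscow program had 75,000 variables.)
supply = [30, 40]          # tonnes available at each depot
demand = [35, 35]          # tonnes required in each district
cost = [[4, 6],            # cost[i][j]: cost per tonne, depot i -> district j
        [5, 3]]

best_cost, best_plan = None, None
# With supplies and demands fixed, one variable (x = tonnes sent from
# depot 0 to district 0) determines the whole plan, so we just enumerate it.
for x in range(0, min(supply[0], demand[0]) + 1):
    plan = [[x, supply[0] - x],
            [demand[0] - x, supply[1] - (demand[0] - x)]]
    if any(q < 0 for row in plan for q in row):
        continue  # negative shipment: infeasible allocation
    total = sum(cost[i][j] * plan[i][j] for i in range(2) for j in range(2))
    if best_cost is None or total < best_cost:
        best_cost, best_plan = total, plan

print("cheapest plan:", best_plan, "total cost:", best_cost)
```

Real solvers use the simplex method or interior-point algorithms rather than enumeration, which is hopeless at 75,000 variables; the point here is only the shape of the problem, a linear cost minimized under linear constraints, which is exactly what gave the B.E.S.M. something to iterate on.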

The Large Electronically Computing Machine, with its remarkable capacity for optimizing, was the direct precursor of our own data-amassing and refining machines. B.E.S.M., after all, was built on the math of Leonid Kantorovich, the economist and Nobel Laureate who is widely considered the father of linear programming. Not long after that, in the United States, George Dantzig made complementary discoveries in linear programming and developed the simplex algorithm. In 1973 Dantzig founded the Systems Optimization Laboratory at Stanford, which is now about a 12-minute drive from the Google headquarters in Mountain View, Calif. — America’s leading producer of data, algorithms and optimization.

The Apple Watch, which arrived on March 9 to the sort of popular rubbernecking that new machinery occasioned in the 1930s, is a Very Small Electronically Computing Machine with some prodigious optimizing chops. After timekeeping, the watch’s chief feature is “fitness tracking”: It clocks and stores physiological data with the aim of getting you to observe and change your habits of sloth and gluttony. Evidently I wasn’t the only one whose thoughts turned to 20th-century despotism: The entrepreneur Anil Dash quipped on Twitter, albeit stretching the truth, “Not since I.B.M. sold mainframes to the Nazis has a high-tech company embraced medical data at this scale.”

And yet what attracts me to the Apple Watch are my own totalitarian tendencies. I would keep very, very close tabs on the data my body produces. How much I eat. How much I sleep. How much I exercise and accomplish. I’m feeling hopeful about this: If I watch the numbers closely and use my new tech wisely, I could really get to minimum food intake and maximum work output. Right there in my Apple Watch: a mini Gulag, optimized just for me.

Correction: March 19, 2015
An earlier version of this article erroneously cited an award for George Dantzig. He did not win a Nobel Prize. 

http://www.nytimes.com/2015/03/22/magazine/a-sucker-is-optimized-every-minute.html

DIGITAL MUSIC NEWS

RIAA: U.S. Digital Streaming Revenue

Surpassed CD Retail Sales In 2014

 

     The Recording Industry Association of America (RIAA) this week reported that music streaming has eclipsed the sale of physical CDs and is closing in on digital downloads as the largest source of revenue in the U.S. recorded music industry. According to RIAA figures, revenues from subscription streaming (e.g., Spotify and Rhapsody) and streaming radio services including Sirius XM hit $1.87 billion in 2014, a 29% increase vs. 2013 and equivalent to 27% of total music industry revenues. CD sales slipped 12.7% to $1.85 billion. As noted by the Financial Times, downloads have been the U.S. music industry’s largest source of digital revenue for a decade, but they peaked in 2012 and have been in decline ever since. In 2014, download revenues fell 8.7% vs. 2013 to $2.58 billion, equivalent to 37% of total industry revenues.

“The music business continues to undergo a staggering transformation,” RIAA Chairman/CEO Cary Sherman said in a statement. “Record companies are now digital music firms, earning more than two-thirds of their revenues from a variety of digital formats.”

In aggregate, the various kinds of streaming outlets generated $1.87 billion, up nearly 29% from the year before – and, for the first time, slightly more than the total for CDs. That figure includes not only paid subscription outlets like Spotify, Rdio and Rhapsody, but also such internet radio services as Pandora, which does not let users pick exactly what songs they will hear, and outlets like YouTube and Spotify’s free tier, which let users pick specific songs and are generally supported by advertising.

Performance royalty fees paid by streaming radio services grew sharply from $590 million in FY:13 to $773 million last year. All physical music sales together – including CDs, vinyl and music videos – slipped below a third of the industry’s total revenues for the first time, falling from 35% in 2013 to 32% last year. Total U.S. retail revenues were flat for the fifth year in succession at $6.97 billion. 

Shift From Sales To Digital Streaming

Is Causing Growing Pains For Artists

 

     This week the RIAA reported streaming music services have begun to overtake sales of physical CDs and music downloads in revenue (see story, above), a shifting business model that has many artists seeing red in more ways than one. While advertisers, listeners, and some label execs are embracing this change, the money collected via subscription and ad-supported license fees slows to a trickle by the time it reaches the music creators. That’s one reason Taylor Swift pulled her tracks from Spotify when she released her 1989 album last fall, and other artists continue to withhold their catalogs from the service.

How worrisome is the drought caused by the shift to streaming? As AdWeek pointed out this week, Pharrell’s “Happy” arguably was the song of 2014, topping the charts in the U.S. and selling 6.45 million copies. It also was in heavy rotation on the digital radio platform Pandora, streaming 43 million times in the first quarter alone. Despite all that exposure, Sony/ATV Music Publishing says it received just $2,700 from Pandora for plays of the tune during that period, which it split with writer Pharrell Williams.
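The royalty figures in that paragraph imply a per-stream rate that is easy to compute. Note this covers only the publishing royalty Sony/ATV reports, not whatever Pandora paid the label and performer separately:

```python
streams = 43_000_000          # Pandora plays of "Happy" in Q1, per AdWeek
publisher_payout = 2_700.00   # what Sony/ATV says Pandora paid for the quarter

per_stream = publisher_payout / streams
writer_share = publisher_payout / 2   # split between Sony/ATV and Pharrell

print(f"publishing royalty per stream: ${per_stream:.7f}")
print(f"writer's share for the quarter: ${writer_share:,.2f}")
```

That works out to roughly six thousandths of a cent per play on the publishing side, which is the drought artists and songwriters are complaining about.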

“Streaming services are going to be the major method in the way music is accessed [but] I don’t think enough money trickles down to the songwriters,” says Sony/ATV CEO Marty Bandier.

By contrast, Pandora argues that its model is justified, with CEO Brian McAndrews insisting that “we want to be an indispensable partner to music makers, and that involves paying a tremendous amount in royalties.”

Of course, AM/FM radio has never paid performance royalties to labels or artists, but the difference here is that digital streaming appears to be replacing music sales, while traditional radio served to promote it. As AdWeek asks (somewhat rhetorically), are music streaming services in as much trouble as the record business they were meant to give new life to? Or are these merely the growing pains of an emerging medium?

 

Forbes: Internet Radio Poised To

Be “Ad Opportunity Of The Future”

 

     Owners of AM/FM radio stations might not agree (or even want to read further), but internet radio has the potential to be the most ubiquitous form of media ever – and the biggest advertising opportunity of the future. That’s the word from Forbes, which this week detailed the reasons digital streaming – no longer a fledgling medium – is set to become an essential part of the targeted media mainstream. As the magazine’s David Porter points out, one third of Americans used their phones to stream music last year, and those 18-24 listened to internet radio more than terrestrial radio. Additionally, two of the top five most-popular apps in the U.S. (Pandora and YouTube) are used for streaming music, which increasingly is becoming personalized to individual listening tastes and experiences.

“Internet radio will need to match every part of your day,” Porter wrote in an industry analysis this week. “Imagine passively being pushed the right music that helps you wake up, motivates you to run faster, work more productively, and more. This type of personalization has already begun in advertising…[and] the barriers to this type of hyper-personal internet radio are slowly being eliminated. The swath of personal data that comprise our tastes is growing, which in turn means we are also able to better understand the tastes of similar people.”

Porter explains that, with the decreasing costs of streaming, collecting, and storing large amounts of data – as well as the growth of powerful tools to analyze it – the possibilities for exploring and drawing inferences from this wealth of information are seemingly endless. “With the inevitable growth of internet radio, as we continue to shift digitally, the abundance of listener data will provide advertisers improved targeting, as well as awareness of their demographic,” he says. “This new paradigm will offer companies an unprecedented opportunity to connect with the right listeners at the perfect moment. The only question remaining, what will your station look like?”

 

iTunes Has Banned “Soundalikes”

Designed To Fool Paying Customers

 

     Apple’s iTunes store last month responded to a surge of “soundalikes” – cover tracks designed to mimic the original song – by aggressively banning them from the online store. As reported by Forbes writer Shawn Setaro, soundalikes are meant to fool listeners into thinking they’re the real deal, and streaming services are flooded with them. Example: The Cheer Squad’s soundalike of Katy Perry’s “California Gurls,” both of which can be found on virtually every streaming music platform.

While someone listening to a streaming service typically can skip to the next song if a soundalike comes on, iTunes customers pay for the track, so a soundalike can fool them into buying the wrong download. Hence, iTunes has sent notices to digital distributors laying out new guidelines that ban the search-friendly titling practices common to soundalikes: putting the original artist’s name in the song’s title, for example, or using phrases like “originally performed by” and “in the style of.” The guidelines called these practices “deceptive and misleading.”

Setaro says these guidelines apply only to iTunes, but they probably will affect all digital music services. Example: Such digital distributors as TuneCore put songs on all the digital music services at the same time, so if a song has one title for iTunes, it has to have the same one for all the other streaming services.

 

Sony Music Buys The Rest Of

The Orchard For $200 Million

 

     Sony Music Entertainment has purchased the remaining equity stake in the Orchard from Dimensional Associates for about $200 million. According to an SEC filing, the Orchard is currently owned by Orchard Assets Holdings, believed to be a joint venture between Dimensional Associates and Sony. According to Billboard, Sony bought what has been consistently described as a majority stake in the Orchard in March 2012. While the exact percentages of Sony’s stake have never been publicly disclosed, sources say Sony owned 51% of the company and Dimensional held the remaining 49%.

The new deal requires regulatory approvals and is expected to close after March 31, 2015. In addition to the Orchard, Sony Music Entertainment also owns RED, widely considered to be one of the biggest indie U.S. distributors.

The Orchard was founded by songwriter/producer Richard Gottehrer and digital music executive Scott Cohen in 1997, and currently has annual revenues of $200 million. 

Rhapsody Subscribers Now Can Share

Music Tracks With Twitter Followers

 

     Rhapsody this week announced its subscribers now can share songs with their Twitter followers, who will be able to play them in full without leaving the social media site – even if they don’t pay for a Rhapsody subscription. Rhapsody claims it’s the first streaming music service to offer full-track playback on Twitter, using Twitter’s Audio Cards feature. The music is fully licensed by Rhapsody, with a percentage of revenue going to artists, labels, and publishers.

Rhapsody says one reason for the approach – which will be available only in the U.S. – is “to reinforce that music isn’t free.” The streaming music company is testing the feature to see if it can recoup the licensing fees, over time, by converting the Twitter exposure into new subscribers for its $9.99/month music service. “It’s going to be a huge experiment in how we make music social again,” Rhapsody CFO Ethan Rudin said in an interview with Geek Wire. “We’re extraordinarily confident in the success we’re going to have in converting people to loyal Rhapsody subscribers. If there is the opportunity to fine-tune and make sure this is economically viable in perpetuity, we want to have the proof points to get it right.”

Rudin says Rhapsody will collect audience data and share the results with its industry partners in an effort to find an approach that works. 

 

A publication of Bunzel Media Resources © 2015

Is a New Political System Emerging in This Country?


This article originally appeared at TomDispatch.com.

The New American Order
1% Elections, The Privatization of the State, a Fourth Branch of Government, and the Demobilization of “We the People”

By Tom Engelhardt

Have you ever undertaken some task you felt less than qualified for, but knew that someone needed to do? Consider this piece my version of that, and let me put what I do understand about it in a nutshell: based on developments in our post-9/11 world, we could be watching the birth of a new American political system and way of governing for which, as yet, we have no name.

And here’s what I find strange: the evidence of this, however inchoate, is all around us and yet it’s as if we can’t bear to take it in or make sense of it or even say that it might be so.

Let me make my case, however minimally, based on five areas in which at least the faint outlines of that new system seem to be emerging: political campaigns and elections; the privatization of Washington through the marriage of the corporation and the state; the de-legitimization of our traditional system of governance; the empowerment of the national security state as an untouchable fourth branch of government; and the demobilization of “we the people.”

Whatever this may add up to, it seems to be based, at least in part, on the increasing concentration of wealth and power in a new plutocratic class and in that ever-expanding national security state. Certainly, something out of the ordinary is underway, and yet its birth pangs, while widely reported, are generally categorized as aspects of an exceedingly familiar American system somewhat in disarray.

1. 1% Elections

Check out the news about the 2016 presidential election and you’ll quickly feel a sense of been-there, done-that. As a start, the two names most associated with it, Bush and Clinton, couldn’t be more familiar, highlighting as they do the curiously dynastic quality of recent presidential contests. (If a Bush or Clinton should win in 2016 and again in 2020, a member of one of those families will have controlled the presidency for 28 of the last 36 years.)

Take, for instance, “Why 2016 Is Likely to Become a Close Race,” a recent piece Nate Cohn wrote for my hometown paper. A noted election statistician, Cohn points out that, despite Hillary Clinton’s historically staggering lead in Democratic primary polls (and lack of serious challengers), she could lose the general election. He bases this on what we know about her polling popularity from the Monica Lewinsky moment of the 1990s to the present. Cohn assures readers that Hillary will not “be a Democratic Eisenhower, a popular, senior statesperson who cruises to an easy victory.” It’s the sort of comparison that offers a certain implicit reassurance about the near future. (No, Virginia, we haven’t left the world of politics in which former general and president Dwight D. Eisenhower can still be a touchstone.)

Cohn may be right when it comes to Hillary’s electability, but this is not Dwight D. Eisenhower’s or even Al Gore’s America. If you want a measure of that, consider this year’s primaries. I mean, of course, the 2015 ones. Once upon a time, the campaign season started with candidates flocking to Iowa and New Hampshire early in the election year to establish their bona fides among party voters. These days, however, those are already late primaries.

The early primaries, the ones that count, take place among a small group of millionaires and billionaires, a new caste flush with cash who will personally, or through complex networks of funders, pour multi-millions of dollars into the campaigns of candidates of their choice. So the early primaries — this year mainly a Republican affair — are taking place in resort spots like Las Vegas, Rancho Mirage, California, and Sea Island, Georgia, as has been widely reported. These “contests” involve groveling politicians appearing at the beck and call of the rich and powerful, and so reflect our new 1% electoral system. (The main pro-Hillary super PAC, for instance, is aiming for a kitty of $500 million heading into 2016, while the Koch brothers network has already promised to drop almost $1 billion into the coming campaign season, doubling their efforts in the last presidential election year.)

Ever since the Supreme Court opened up the ultimate floodgates with its 2010 Citizens United decision, each subsequent election has seen record-breaking amounts of money donated and spent. The 2012 presidential campaign was the first $2 billion election; campaign 2016 is expected to hit the $5 billion mark without breaking a sweat. By comparison, according to Burton Abrams and Russell Settle in their study, “The Effect of Broadcasting on Political Campaign Spending,” Republicans and Democrats spent just under $13 million combined in 1956 when Eisenhower won his second term.

In the meantime, it’s still true that the 2016 primaries will involve actual voters, as will the election that follows. The previous election season, the midterms of 2014, cost almost $4 billion, a record despite the number of small donors continuing to drop. It also represented the lowest midterm voter turnout since World War II. (See: demobilization of the public, below — and add in the demobilization of the Democrats as a real party, the breaking of organized labor, the fragmenting of the Republican Party, and the return of voter suppression laws visibly meant to limit the franchise.) It hardly matters just what the flood of new money does in such elections, when you can feel the weight of inequality bearing down on the whole process in a way that is pushing us somewhere new.

2. The Privatization of the State (or the U.S. as a Prospective Third-World Nation)

In the recent coverage of the Hillary Clinton email flap, you can find endless references to the Clintons of yore in wink-wink, you-know-how-they-are-style reporting; and yes, she did delete a lot of emails; and yes, it’s an election year coming and, as everyone points out, the Republicans are going to do their best to keep the email issue alive until hell freezes over, etc., etc. Again, the coverage, while eyeball-gluing, is in a you’ve-seen-it-all-before, you’ll-see-it-all-again mode.

However, you haven’t seen it all before. The most striking aspect of this little brouhaha lies in what’s most obvious but least highlighted. An American secretary of state chose to set up her own private, safeguarded email system for doing government work; that is, she chose to privatize her communications. If this were Cairo, it might not warrant a second thought. But it didn’t happen in some third-world state. It was the act of a key official of the planet’s reigning (or thrashing) superpower, which — even if it wasn’t the first time such a thing had ever occurred — should be taken as a tiny symptom of something that couldn’t be larger or, in the long stretch of history, newer: the ongoing privatization of the American state, or at least the national security part of it.

Though the marriage of the state and the corporation has a pre-history, the full-scale arrival of the warrior corporation only occurred after 9/11. Someday, that will undoubtedly be seen as a seminal moment in the formation of whatever may be coming in this country. Only 13 years later, there is no part of the war state that has not experienced major forms of privatization. The U.S. military could no longer go to war without its crony corporations doing KP and guard duty, delivering the mail, building the bases, and being involved in just about all of its activities, including training the militaries of foreign allies and even fighting. Such warrior corporations are now involved in every aspect of the national security state, including torture, drone strikes, and — to the tune of hundreds of thousands of contract employees like Edward Snowden — intelligence gathering and spying. You name it and, in these years, it’s been at least partly privatized.

All you have to do is read reporter James Risen’s recent book, Pay Any Price, on how the global war on terror was fought in Washington, and you know that privatization has brought something else with it: corruption, scams, and the gaming of the system for profits of a sort that might normally be associated with a typical third-world kleptocracy. And all of this, a new world being born, was reflected in a tiny way in Hillary Clinton’s very personal decision about her emails.

Though it’s a subject I know so much less about, this kind of privatization (and the corruption that goes with it) is undoubtedly underway in the non-war-making, non-security-projecting part of the American state as well.

3. The De-legitimization of Congress and the Presidency

On a third front, American “confidence” in the three classic check-and-balance branches of government, as measured by polling outfits, continues to fall. In 2014, Americans expressing a “great deal of confidence” in the Supreme Court hit a new low of 23%; in the presidency, it was 11%, and in Congress a bottom-scraping 5%. (The military, on the other hand, registers at 50%.) The figures for “hardly any confidence at all” are respectively 20%, 44%, and more than 50%. All are in or near record-breaking territory for the last four decades.

It seems fair to say that in recent years Congress has been engaged in a process of delegitimizing itself. Where that body once had the genuine power to declare war, for example, it is now “debating” in a desultory fashion an “authorization” for a war against the Islamic State in Syria, Iraq, and possibly elsewhere that has already been underway for eight months and whose course, it seems, will be essentially unaltered, whether Congress authorizes it or not.

What would President Harry Truman, who once famously ran a presidential campaign against a “do-nothing” Congress, have to say about a body that truly can do just about nothing? Or rather, to give the Republican war hawks in that new Congress their due, not quite nothing. They are proving capable of acting effectively to delegitimize the presidency as well. House Speaker John Boehner’s invitation to Israeli Prime Minister Benjamin Netanyahu to undercut the president’s Iranian nuclear negotiations and the letter signed by 47 Republican senators and directed to the Iranian ayatollahs are striking examples of this. They are visibly meant to tear down an “imperial presidency” that Republicans gloried in not so long ago.

The radical nature of that letter, not as an act of state but of its de-legitimization, was noted even in Iran, where fundamentalist Supreme Leader Ali Khamenei proclaimed it “a sign of a decline in political ethics and the destruction of the American establishment from within.” Here, however, the letter is either being covered as a singularly extreme one-off act (“treason!”) or, as Jon Stewart did on “The Daily Show,” as part of a repetitive tit-for-tat between Democrats and Republicans over who controls foreign policy. It is, in fact, neither. It represents part of a growing pattern in which Congress becomes an ever less effective body, except in its willingness to take on and potentially take out the presidency.

In the twenty-first century, all that “small government” Republicans and “big government” Democrats can agree on is offering essentially unconditional support to the military and the national security state. The Republican Party — its various factions increasingly at each other’s throats almost as often as at those of the Democrats — seems reasonably united solely on issues of war-making and security. As for the Democrats, an unpopular administration, facing constant attack by those who loathe President Obama, has kept its footing in part by allying with and fusing with the national security state. A president who came into office rejecting torture and promoting sunshine and transparency in government has, in the course of six-plus years, come to identify himself almost totally with the U.S. military, the CIA, the NSA, and the like. While it has launched an unprecedented campaign against whistleblowers and leakers (as well as sunshine and transparency), the Obama White House has proved a powerful enabler of, but also remarkably dependent upon, that state-within-a-state, a strange fate for “the imperial presidency.”

4. The Rise of the National Security State as the Fourth Branch of Government

One “branch” of government is, however, visibly on the rise and rapidly gaining independence from just about any kind of oversight. Its ability to enact its wishes with almost no opposition in Washington is a striking feature of our moment. But while the symptoms of this process are regularly reported, the overall phenomenon — the creation of a de facto fourth branch of government — gets remarkably little attention. In the war on terror era, the national security state has come into its own. Its growth has been phenomenal. Though it’s seldom pointed out, it should be considered remarkable that in this period we gained a second full-scale “defense department,” the Department of Homeland Security, and that it and the Pentagon have become even more entrenched, each surrounded by its own growing “complex” of private corporations, lobbyists, and allied politicians. The militarization of the country has, in these years, proceeded apace.

Meanwhile, the duplication to be found in the U.S. Intelligence Community with its 17 major agencies and outfits is staggering. Its growing ability to surveil and spy on a global scale, including on its own citizens, puts the totalitarian states of the twentieth century to shame. That the various parts of the national security state can act in just about any fashion without fear of accountability in a court of law is by now too obvious to belabor. As wealth has traveled upwards in American society in ways not seen since the first Gilded Age, so taxpayer dollars have migrated into the national security state in an almost plutocratic fashion.

New reports regularly surface about the further activities of parts of that state. In recent weeks, for instance, we learned from Jeremy Scahill and Josh Begley of the Intercept that the CIA has spent years trying to break the encryption on Apple iPhones and iPads; it has, that is, been aggressively seeking to attack an all-American corporation (even if significant parts of its production process are actually in China). Meanwhile, Devlin Barrett of the Wall Street Journal reported that the CIA, an agency barred from domestic spying operations of any sort, has been helping the U.S. Marshals Service (part of the Justice Department) create an airborne digital dragnet on American cell phones. Planes flying out of five U.S. cities carry a form of technology that “mimics a cellphone tower.” This technology, developed and tested in distant American war zones and now brought to “the homeland,” is just part of the ongoing militarization of the country from its borders to its police forces. And there’s hardly been a week since Edward Snowden first released crucial NSA documents in June 2013 when such “advances” haven’t been in the news.

News also regularly bubbles up about the further expansion, reorganization, and upgrading of parts of the intelligence world, the sorts of reports that have become the barely noticed background hum of our lives. Recently, for instance, Director John Brennan announced a major reorganization of the CIA meant to break down the classic separation between spies and analysts at the Agency, while creating a new Directorate of Digital Innovation responsible for, among other things, cyberwarfare and cyberespionage. At about the same time, according to the New York Times, the Center for Strategic Counterterrorism Communications, an obscure State Department agency, was given a new and expansive role in coordinating “all the existing attempts at countermessaging [against online propaganda by terror outfits like the Islamic State] by much larger federal departments, including the Pentagon, Homeland Security and intelligence agencies.”

This sort of thing is par for the course in an era in which the national security state has only grown stronger, endlessly elaborating, duplicating, and overlapping the various parts of its increasingly labyrinthine structure. And keep in mind that, in a structure that has fought hard to keep what it’s doing cloaked in secrecy, there is so much more that we don’t know. Still, we should know enough to realize that this ongoing process reflects something new in our American world (even if no one cares to notice).

5. The Demobilization of the American People

In The Age of Acquiescence, a new book about America’s two Gilded Ages, Steve Fraser asks why it was that, in the nineteenth century, another period of plutocratic excesses, concentration of wealth and inequality, buying of politicians, and attempts to demobilize the public, Americans took to the streets with such determination and in remarkable numbers over long periods of time to protest their treatment, and stayed there even when the brute power of the state was called out against them. In our own moment, Fraser wonders, why has the silence of the public in the face of similar developments been so striking?

After all, a grim new American system is arising before our eyes. Everything we once learned in the civics textbooks of our childhoods about how our government works now seems askew, while the growth of poverty, the flatlining of wages, the rise of the .01%, the collapse of labor, and the militarization of society are all evident.

The process of demobilizing the public certainly began with the military. It was initially a response to the disruptive and rebellious draftees of the Vietnam era. In 1973, at the stroke of a presidential pen, the citizen’s army was declared no more, the raising of new recruits was turned over to advertising agencies (a preview of the privatization of the state to come), and the public was sent home, never again to meddle in military affairs. Since 2001, that form of demobilization has been etched in stone and transformed into a way of life in the name of the “safety” and “security” of the public.

Since then, “we the people” have made ourselves felt in only three disparate ways: from the left, in the Occupy movement, which, with its slogans about the 1% and the 99%, put the issue of growing economic inequality on the map of American consciousness; from the right, in the Tea Party movement, a complex expression of discontent backed and at least partially funded by right-wing operatives and billionaires, and aimed at the de-legitimization of the “nanny state”; and, in the streets, in the recent round of post-Ferguson protests spurred at least in part by the militarization of the police in black and brown communities around the country.

The Birth of a New System

Otherwise, a moment of increasing extremity has also been a moment of — to use Fraser’s word — “acquiescence.” Someday, we’ll presumably understand far better how this all came to be. In the meantime, let me be as clear as I can be about something that seems murky indeed: this period doesn’t represent a version, no matter how perverse or extreme, of politics as usual; nor is the 2016 campaign an election as usual; nor are we experiencing Washington as usual. Put together our 1% elections, the privatization of our government, the de-legitimization of Congress and the presidency, as well as the empowerment of the national security state and the U.S. military, and add in the demobilization of the American public (in the name of protecting us from terrorism), and you have something like a new ballgame.

While significant planning has been involved in all of this, there may be no ruling pattern or design. Much of it may be happening in a purely seat-of-the-pants fashion. In response, there has been no urge to officially declare that something new is afoot, let alone convene a new constitutional convention. Still, don’t for a second think that the American political system isn’t being rewritten on the run by interested parties in Congress, our present crop of billionaires, corporate interests, lobbyists, the Pentagon, and the officials of the national security state.

Out of the chaos of this prolonged moment and inside the shell of the old system, a new culture, a new kind of politics, a new kind of governance is being born right before our eyes. Call it what you want. But call it something. Stop pretending it’s not happening.

Tom Engelhardt is a co-founder of the American Empire Project and the author of The United States of Fear as well as a history of the Cold War, The End of Victory Culture. He is a fellow of the Nation Institute and runs TomDispatch.com. His latest book is Shadow Government: Surveillance, Secret Wars, and a Global Security State in a Single-Superpower World (Haymarket Books).

[Note: My special thanks go to my friend John Cobb, who talked me through this one. Doing it would have been inconceivable without him. Tom]

Copyright 2015 Tom Engelhardt

https://medium.com/@TomDispatch/engelhardt-is-a-new-political-system-emerging-in-this-country-fbfa0acbe185

America is becoming more like the illiberal pseudo-democracies and kleptocracies.

Can American Democracy Survive Against Rising Political Corruption and Privatization?

In 1932, on the eve of FDR’s presidency, Benito Mussolini proclaimed, “The liberal state is destined to perish.” He added, all too accurately, “All the political experiments of our day are anti-liberal.”

The democracies were doomed, Il Duce declared, because they could not solve crucial problems. Unlike the dictatorships, which were willing to forcefully use a strong state, the democracies could not fix their broken economies. Parliamentary systems were hamstrung politically. The democracies were also war-weary, conflict-averse, and ill-prepared to fight. The fascists, unlike the democracies, had solved the problem of who was part of the community.

Mussolini’s ally, Adolf Hitler, was further contemptuous of “mongrelization” in American democracy. Who was an American? How did immigrants fit in? What about Negroes? The fascist states, by contrast, rallied their citizens to a common vision and a common purpose. Hitler was quite confident that he knew who was a German and who was not. To prove it, he fashioned the Nuremberg laws; he annexed German-speaking regions of his neighbors. As Hitler infamously put it, Ein Volk, ein Reich, ein Fuehrer.

Though he was a buffoonish dictator, Il Duce was not such a bad political scientist. In the 1930s, a lot of liberal democrats wondered the same thing, and for the same reasons. As Ira Katznelson wrote in Fear Itself: “Such beliefs and opinions were not limited to dictators and dictatorships. As Roosevelt prepared to speak [in his first inaugural], skepticism was prevalent about whether representative parliamentary democracies could cope within their liberal constitutional bounds with capitalism’s utter collapse, the manifest military ambitions by the dictatorships, or international politics characterized by ultranationalist territorial demands. Hesitation, alarm, and democratic exhaustion were widespread.”

The democracies did survive, of course, and they flourished. The New Deal got us halfway out of the Great Depression, and the war buildup did the rest. Fascism was defeated, militarily and ideologically. The collapse of Soviet communism took another half-century. Thanks to the wisdom of containment, Stalinism fell of its own weight, as both an economic and political failure.

Not only did the democracies endure—by the 1980s, America had broadened the inclusiveness of its polity. Europe had embarked on a bold experiment toward continental democracy. In the final days of communism, there was triumphalism in the West. Francis Fukuyama even proclaimed, incautiously, in his 1989 essay, “The End of History?” that all societies were necessarily gravitating toward capitalism and democracy, two ideals that were supposedly linked.

***

Today, it is Mussolini’s words that resonate. Once again, the democracies are having grave difficulty pulling their economies out of a prolonged economic slump. Once again, they are suffering from parliamentary deadlock and loss of faith in democratic institutions. The American version reflects a radically obstructionist Republican Party taking advantage of constitutional provisions that Madison (and Obama) imagined as promoting compromise; instead, the result is deadlock. The European variant is enfeebled by the multiple veto points of a flawed European Union unable to pursue anything but crippling austerity. Once again, several anti-liberal alternatives are on the march. “All the political experiments of our day are anti-liberal.”

Take a tour of the horizon. Mussolini would not be surprised. The fastest-growing economy, China’s, is nothing if not anti-liberal, and getting steadily more adroit at suppressing liberal aspirations. The Beijing regime, which has learned the virtues of patience since Tiananmen, waited out the Hong Kong protests and efficiently shut them down. The Hong Kong elections of 2017 will be limited to candidates approved by the communist regime on the mainland. Capitalism was supposed to bring with it democracy and rule of law. But the Chinese have been superbly effective at combining dynamic state-led capitalism with one-party rule.

What unites regimes as dissimilar as Iran, Turkey, Hungary, Egypt, Venezuela, and Russia is that they combine some of the outward forms of democracy with illiberal rule. The press is not truly free, but is mostly a tool of the government. Editors and journalists are in personal danger of disappearing. There are elections, but the opposition somehow doesn’t get to come to power. Minority religious and ethnic groups are repressed, sometimes subtly, sometimes brutally. Dissidents, even if they break no laws, risk life and limb. The regimes in these nations feature varying degrees of corrupt collusion between the state and economic oligarchs, which helps keep both in power. In Hungary, a member of the E.U., which is a union of liberal democracies, Prime Minister Viktor Orbán has expressly invoked the ideal of an illiberal state. In Turkey, Recep Tayyip Erdogan has dramatically increased enrollments in state-supported religious schools and automatically assigned some children to them, against the wishes of their secular parents.

Turkey is a stalwart member of NATO. Elsewhere in the Middle East, our closest allies don’t even go through the motions of democracy; they are proud monarchies. Israel, our most intimate friend in the region, is becoming less of a democracy almost daily. Israelis are seriously debating whether to formally sacrifice elements of democracy for Jewish identity. And this tally doesn’t even include the flagrant tyrannies such as the insurgency that calls itself the Islamic State, or ISIL. All the political experiments of our day are anti-liberal.

Ironically, some liberals are pinning great hopes on recent stirrings in a venerable institution of hierarchy, autocracy, secrecy, and privilege that has been the antithesis of liberal for nearly two millennia—the Catholic Church, now under a reformist pope. One has to wish Francis well and hope that his new openness extends to the entire institution, but these reforms are fragile. It has been a few centuries since the Church murdered its rivals, but in my lifetime the Church was very cozy with fascists.

One of the great inventions of liberal democracy was the concept of a loyal opposition. You could oppose the government without being considered treasonous. A leader, conversely, could be tossed out of office by the electorate without fearing imprisonment or execution by successors. In much of the world, this ideal now seems almost quaint, and certainly imprudent. A corrupt or dictatorial regime has much to fear from displacement, including jail and even death at the hands of an opposition in power.

There are a few bright spots. Some of Africa has managed to have roughly free and fair elections. South Africa’s young democracy is fragile, but seems to be holding. Some of the Pacific Rim is moving in the direction of genuine democracy. Many former Soviet satellites in Eastern Europe are functioning democracies, even liberal ones. And democratic aspiration is far from dead, as events in Ukraine show. Latin America has more democratically elected governments than it has had in a generation, but it also has several nominal democracies that are illiberal, or prone to coups, or simply corrupt. Mexico, our close NAFTA partner, epitomizes illiberal democracy.

***

But it is the democratic heartland, Europe and North America, that presents the most cause for dismay. Rather than the United States serving as a beacon to inspire repressed peoples seeking true liberal democracy, America is becoming more like the illiberal pseudo-democracies and kleptocracies. A dispassionate review of what is occurring in our own country has to include deliberate suppression of the right to vote; ever more cynical manipulation of voting districts in the nation that invented gerrymandering; the deepening displacement of citizenship with money and rise of plutocracy; the corruption of the regulatory process; a steep decline in public confidence in government and in democracy itself; and a concomitant doubt that democratic participation is worth the trouble. In my piece in this issue’s special report, I address some of these questions in the context of markets versus government, but the challenge goes much deeper.

Obstruction feeds public cynicism about government. Though the mischief and refusal to compromise are mostly one-sided—it is hard to recall a Democratic president more genuinely eager to accommodate the opposition than Barack Obama—the resulting deadlock erodes confidence in democracy and government in general. Why can’t these people just get along and work for the common good? Democrats, as the party that believes in government, take the blame more than Republicans. Government’s failure to address festering, complex problems feeds the dynamic.

This is all the more alarming because the challenges ahead will require strong government and, above all, legitimate government. At best, global climate change and sea level rise will require public coordination and some personal dislocation. Transition to a sustainable economy demands far more intensive public measures, as well as public trust that changing old habits of carbon energy use need not reduce living standards. The risk of epidemics such as Ebola will require more effective government to coordinate responses that the private sector can’t manage. The popular frustration with flat or declining earnings for all but the top demands more government intervention. Weak government can’t accomplish any of this. Mussolini’s taunt burns: the liberal democracies are incapable of solving national problems.

A generation ago, political scientists coined a useful phrase—strong democracy. The Prospect published some pieces making this case, by authors like Benjamin Barber. Others, such as Jane Mansbridge and James Fishkin, writing in the same spirit, called for more participatory democracy. The common theme was that democracy needed to be re-energized, with more citizen involvement, more direct deliberation. What has happened is the reverse. The combination of economic stresses, the allure of other entertainments, the rise of the Internet as a venue for more social interchange but less civic renewal, has left democracy weaker when it needs to be stronger.

The other contention of the fascists—that the democracies had trouble with the vexing questions of community and membership—was never more of a challenge. In Europe, the poisonous mix of high unemployment, anxiety about terrorism, and influx of refugees and immigrants is feeding a vicious nationalist backlash and nurturing the far right. At home, the failure to normalize the status of an estimated 12 million immigrants lacking proper documents deprives large numbers of residents of normal rights and stokes nativism. Assaults on voting rights even for citizens, coupled with physical assaults by police, make African Americans less than full members of the democracy, despite the civil rights revolution of half a century ago.

Mussolini’s other taunt was that the liberal democracies were too divided and war-weary to fight. When Hitler remilitarized the Rhineland in March 1936, in defiance of the Treaty of Versailles, the democracies did nothing. They dithered right up until Germany’s invasion of Poland in September 1939. As late as 1940, Roosevelt was more eager to keep America out of another European war than to help the British make a stand against the Nazis.

The military challenge today is more complex. America in this century has vacillated between grandiosity and timidity. It fought the wrong war in Iraq, and then may have pulled out prematurely. The administration has been weak and divided in its policies toward Syria and ISIL. To some extent this is understandable; these are hydra-headed threats, with no easy solutions. If President Obama is ambivalent, the public is even more so. Yet the greatest military threats to American democracy are not the risks of invasion or terrorist assault, but what we are doing to ourselves. The Obama administration, like that of George W. Bush, has been all too willing to subordinate liberty to security, secrecy, and autocracy, even in cases where these objectives are not in direct contention.

The risk is not that American democracy will abruptly “perish,” but that it will be slowly denuded of its vital content. If we are to reverse the appeal of anti-liberal society globally, we have to repair our democracy at home. The challenge is multifaceted, and will take time. It should be the great project of the next president and the ongoing work of the citizenry.

Robert Kuttner is the former co-editor of the American Prospect and a senior fellow at Demos. His latest book is “Obama’s Challenge: America’s Economic Crisis and the Power of a Transformative Presidency.”

 

http://www.alternet.org/news-amp-politics/can-american-democracy-survive-against-rising-political-corruption-and?akid=12900.265072.Crh72K&rd=1&src=newsletter1033428&t=11

Welcome to ‘Libertarian Island’: Inside the Frightening Economic Dreams of Silicon Valley’s Super Rich


Ayn Rand, Peter Thiel, Rand Paul (Credit: AP/Reuters/Fred Prouser/Charles Dharapak/Photo montage by Salon)

The idea that we are all in it together is foreign to the tech billionaires.

In the clever science fiction video game BioShock, an Objectivist business magnate named Andrew Ryan (recognize those initials?) creates an underwater city where the world’s elite can flourish free from the controls of government. It is a utopian village that Ayn Rand and her hero John Galt would surely approve of, but unfortunately it ends up becoming a dystopian nightmare after class distinctions form (what a shocker) and technological innovation gets out of hand. It was a hell of a video game, for those of you into that kind of thing.

But I don’t bring up BioShock to talk about video games. I bring it up because a similar movement is currently happening in real life, and it is being funded by another rather eccentric businessman, the PayPal billionaire Peter Thiel. As some may already know, Thiel has teamed up with Patri Friedman, the grandson of libertarian icon Milton Friedman, to try to develop a “seastead,” or a permanent and autonomous dwelling at sea. Friedman formed the “Seasteading Institute” in 2008, and Thiel has donated more than a million dollars to fund its creation.

It is all very utopian, to say the least. But on its website, the institute claims a floating city could be just years away. The real trick is finding a proper location to build this twenty-first-century Atlantis. Currently, they are attempting to find a host nation that will allow the floating city fairly close to land, for the calm waters and the ability to travel easily to and from the seastead.

The project has been dubbed “libertarian island,” and it reveals a growing movement within Silicon Valley: a sort of free-market techno-capitalist faction that seems to come right out of Ayn Rand’s imagination. And as with all utopian ideologies, it is very appealing, especially when you live in a land where everything seems possible, given the proper technological advancements.

Tech billionaires like Thiel, Travis Kalanick, and Marc Andreessen are leading the libertarian revolution in the land of computers, and it is not a surprising place for this laissez-faire ideology to flourish. Silicon Valley is generally considered to have a laid-back Californian culture, but behind all of the polite cordialities there rests a necessarily cutthroat attitude. A perfect example of this was Steve Jobs, who was so revered by the community, and much of the world, yet almost psychopathically merciless. The recent antitrust case against big tech companies like Google, Apple, and Intel, which colluded not to recruit each other’s employees, has even led to speculation that Jobs, were he still alive, might be in jail today.

So while Silicon Valley is no doubt a socially progressive place (on gay marriage, for instance), if one looks past social beliefs, there is as much ruthlessness as you’d expect in any capitalist industry. Look at the offshore tax avoidance, the despicable overseas working conditions, the outright violations of privacy and illegal behavior. There is a very real arrogance within Silicon Valley that seems to care little about rules and regulations.

Libertarianism preaches a night-watchman government that stays out of business’s way and allows private industries to regulate themselves. It is a utopian ideology, as communism was, with an almost religious faith in the free market and an absolute distrust of any government. It is a perfect philosophy for a large corporation like Apple, Google, or Facebook. If we lived in an ideal libertarian society, these companies would not have to avoid taxes, because taxes would be nonexistent, and they wouldn’t have to worry about annoying restrictions on privacy. In a libertarian society, these companies could regulate their own actions, and surely Google, with its famous “Don’t be evil” slogan, believes in corporate altruism.

In the Valley, innovation and entrepreneurship are everything, so a blind faith in the market is hardly shocking. And last year one of the leading libertarians, Rand Paul, flew out to San Francisco to speak at the Lincoln Labs Reboot Conference, held to “create and support a community of like-minded individuals who desire to advance liberty in the public square with the use of technology.” Paul said at the conference, “use your ingenuity, use your big head to think of solutions the marketplace can figure out, that the idiots and trolls in Washington will never come up with,” surely earning laughs and pats on the back.

Rand Paul has had one-on-one meetings with Mark Zuckerberg and with the floating-island billionaire himself, Peter Thiel. Travis Kalanick, the founder and CEO of Uber, is another noted libertarian, who used to have the cover of Ayn Rand’s “The Fountainhead” as his Twitter icon. Kalanick runs Uber just as a devoted follower of Ayn Rand would, continuously fighting regulators and living by what the writer Paul Carr has called the “cult of disruption.” Carr nicely summarizes the philosophy of this cult: “In a digitally connected age, there’s absolutely no need for public carriage laws (or hotel laws, or food safety laws, or… or…) because the market will quickly move to drive out bad actors. If an Uber driver behaves badly, his low star rating will soon push him out of business.”

So basically, with the internet, regulation has become nothing more than an outdated relic of the past, and today consumers truly have the power to make corporations behave by speaking out on social media, providing negative ratings on Yelp, filing a petition on Change.org, and so on. It is the same old libertarian argument wrapped in a new millennial cloak: corporations will act ethically because, if they don’t, consumers will go elsewhere.

As usual, this leaves out important realities that don’t sit well with the self-regulation myth: the irrationality and apathy of consumers, the lack of information available to them, and the overall secretive nature of corporations. The problem with self-regulation is that consumers do not know what goes on behind a corporation’s closed doors; how can they force a company to act ethically if they are unaware of its misdeeds? Had the government not gone after Google for privacy violations, users would never have known. Google and other tech companies have a constant craving for innovation above all else, and they bypass things like privacy when it gets in the way. Would they control themselves had the government not stepped in?

Another important truth is that many consumers willfully continue using products even when a company has done something contrary to their moral beliefs. It is a sort of hypocritical selfishness in which one puts comfort or convenience over ethics. Just look at Apple: everyone is aware of the appalling factory conditions and the tax avoidance, but that doesn’t stop many people from buying the latest iPhone.

When looking at other industries, like oil and gas, the myth of self-regulation is even more comical. The famous oil billionaires the Koch brothers, who are also fanatic libertarians, have knowingly avoided regulations and hurt people in the process. During the nineties, they were particularly careless, and the bottom line influenced every decision. When pipelines were in bad shape, they would calculate whether it was more profitable to fix them or to leave them as they were and possibly pay off a lawsuit in the future. In 1996, one pipeline left in disrepair leaked butane into the air, killing two teenagers who ignited the vapor with the spark of their car ignition.

Even if consumers were completely rational and had access to all information, would it really be worth waiting for companies to come around? For example, many libertarians argue that legislation making seat belts and airbags mandatory in all vehicles was pointless, because the free market would have eventually brought them anyway. But even if this were true, how long would it have taken, and how many lives would the wait have cost?

The most damning evidence against the myth of self-regulation may very well be history. Before government regulatory agencies like the FDA came around, the safety of workers and consumers was constantly at stake, as muckrakers like Upton Sinclair described so vividly. More recently, the lack of regulation in the financial industry, particularly in derivatives, contributed to one of the worst economic crises in history and hurt many people in the process.

Libertarians are uninterested in these realities and believe that all government intervention is useless and stifles innovation, and it is the “cult of innovation” that makes the libertarian philosophy particularly popular in technology-obsessed Silicon Valley. In their world, innovation is more important than privacy or safety, and the best and brightest should not have to play by the rules.

While Silicon Valley overall still supports Democrats over Republicans, it would not be surprising to see a shift in the coming years. The libertarian philosophy is very attractive to those who worship technology and entrepreneurship, which is nearly all of the techies. And with millions of potential campaign dollars coming out of the Valley, it could very well be problematic territory for liberals in the future.

 

http://www.alternet.org/news-amp-politics/welcome-libertarian-island-inside-frightening-economic-dreams-silicon-valleys?akid=12898.265072._0WWy9&rd=1&src=newsletter1033376&t=9

Paul Krugman: Netanyahu Has Used Iran to Conceal Israel’s Economic Disaster


Israel’s prime minister is trying to hide a big problem.

The real reason behind Israeli Prime Minister Benjamin Netanyahu’s recent anti-Iran speech to Congress had nothing to do with foreign policy, Paul Krugman opines in Monday’s column; insulting the president is hardly the way to conduct it. No, Netanyahu has a serious problem at home, and polls suggest that he may well get the boot in Tuesday’s election. That problem might sound familiar: Israel has become almost as unequal as America, and there is widespread economic discontent in a country once built on the socialist ideals of the kibbutz system.

Economic discontent is not the usual mainstream story we hear about Israel. The country is a high-technology powerhouse, and its economy has grown rapidly, barely affected by the worldwide recession that began in 2008. But the spoils of that growth have gone disproportionately to Israel’s own version of the one percent. According to Krugman, since the early 1990s,

Israel has experienced a dramatic widening of income disparities. Key measures of inequality have soared; Israel is now right up there with America as one of the most unequal societies in the advanced world. And Israel’s experience shows that this matters, that extreme inequality has a corrosive effect on social and political life.

Consider what has happened at either end of the spectrum — the growth in poverty, on one side, and extreme wealth, on the other.

According to Luxembourg Income Study data, the share of Israel’s population living on less than half the country’s median income — a widely accepted definition of relative poverty — more than doubled, to 20.5 percent from 10.2 percent, between 1992 and 2010. The share of children in poverty almost quadrupled, to 27.4 percent from 7.8 percent. Both numbers are the worst in the advanced world, by a large margin.

And when it comes to children, in particular, relative poverty is the right concept. Families that live on much lower incomes than those of their fellow citizens will, in important ways, be alienated from the society around them, unable to participate fully in the life of the nation. Children growing up in such families will surely be placed at a permanent disadvantage.

At the other end, while the available data — puzzlingly — don’t show an especially large share of income going to the top 1 percent, there is an extreme concentration of wealth and power among a tiny group of people at the top. And I mean tiny. According to the Bank of Israel, roughly 20 families control companies that account for half the total value of Israel’s stock market. The nature of that control is convoluted and obscure, working through “pyramids” in which a family controls a firm that in turn controls other firms and so on. Although the Bank of Israel is circumspect in its language, it is clearly worried about the potential this concentration of control creates for self-dealing.

The widening inequality in Israel, like that in the U.S., is the result of policy decisions, not the naturally occurring phenomenon that free marketeers like to claim. Shockingly, according to Krugman, “Israel does less to lift people out of poverty than any other advanced country — yes, even less than the United States.” Now that is saying something. And those living in poverty are not just Israel’s oppressed Arab population and ultra-Orthodox Jews.

Israel’s oligarchs, like Russia’s, managed to gain control of businesses that were privatized in the 1980s. That control gives them outsized influence on policy. Works every time. Netanyahu is a big advocate for policies that keep them sitting pretty, and like New Jersey’s Chris Christie, the Israeli P.M. enjoys sitting and traveling pretty himself, often on the taxpayer’s dime.

There are serious signs that Israelis are sick of it, and that Netanyahu’s bombast in the U.S. Congress did nothing to fool them.